Clin Cancer Res. 2018 Jul 15;24(14):3447-3455. doi: 10.1158/1078-0432.CCR-18-0227. Epub 2018 Apr 11.

A Survey on Data Reproducibility and the Effect of Publication Process on the Ethical Reporting of Laboratory Research


Delphine R Boulbes et al. Clin Cancer Res. 2018.

Abstract

Purpose: The successful translation of laboratory research into effective therapies depends on the validity of peer-reviewed publications. However, several publications in recent years suggested that published scientific findings could be reproduced only 11% to 45% of the time. Multiple surveys attempted to elucidate the fundamental causes of data irreproducibility and underscored potential solutions: more robust experimental designs, better statistics, and better mentorship. However, no prior survey has addressed the role of the review and publication process in honest reporting.

Experimental Design: We developed an anonymous online survey intended for trainees involved in bench research. The survey included questions related to mentoring/career development, research practice, integrity, and transparency, and how the pressure to publish and the publication process itself influence their reporting practices.

Results: Responses to questions related to mentoring and training practices were largely positive, although an average of approximately 25% of respondents did not seem to receive optimal mentoring. A total of 39.2% revealed having been pressured by a principal investigator or collaborator to produce "positive" data, and 62.8% admitted that the pressure to publish influences the way they report data. The majority of respondents did not believe that extensive revisions significantly improved the manuscript, even as they added to the cost and time invested.

Conclusions: This survey indicates that trainees believe the pressure to publish affects honest reporting, mostly emanating from our system of rewards and advancement. The publication process itself affects faculty and trainees and appears to influence a shift in their ethics from honest reporting ("negative data") to selective reporting, data falsification, or even fabrication. Clin Cancer Res; 24(14); 3447-55. ©2018 AACR.


Conflict of interest statement

The authors declare no potential conflicts of interest.

Figures

Fig 1
Fig 1. Responses to questions about mentoring and supervision from 467 respondents
A. Responses to question 6. Comments provided in response to “other, please explain” were either extrapolated to fit into one of the original choices or used to create the new categories of “twice per month,” “daily,” “as needed,” and “rarely.” B. Responses to question 7. Comments provided in response to “other, please explain” were either extrapolated to fit into one of the original choices or used to create the new categories of “monthly,” “yearly,” “variable,” “never,” and “not applicable.” C. Responses to question 8a. The 220 respondents who said they felt pressured to provide “positive” data (question 8) indicated that the pressure came either from a principal investigator (PI), from a colleague/collaborator, or was self-induced. Because several answers could be selected, the data are shown as absolute values of all responses selected (359 total).
Fig 2
Fig 2. Responses to questions about best research practices
A. Responses to question 13. Among the 465 respondents, 112 responded “other” and explained it as “not applicable.” This graph represents the responses of the 353 other respondents. B. Responses to question 14. Among the 465 respondents, 121 responded “other” and explained it as “not applicable.” This graph represents the responses of the 344 other respondents. C. Responses to question 18. Among the 467 respondents, 14 responded “other” and explained it as “not applicable.” This graph represents the responses of the 453 other respondents. D. Responses to question 12. When the answer given was “other, please explain,” the comments provided by these respondents were either extrapolated to fit into one of the original categories or used to create the new category “It depends on the study.” All 467 respondents answered this question, but 15 of the “other” responses did not fit into any of the existing categories or the new category. This graph represents the responses of the 452 other respondents.
Fig 3
Fig 3. Responses to questions about research integrity and transparency
Responses were provided by all 467 respondents to questions 5 (A), 27 (B), 10 (C), and 11 (D).
Fig 4
Fig 4. Responses to questions about the publication process
A. Responses to question 22. Among the 467 respondents, 285 responded “other” and explained it as “not applicable.” This graph represents the responses of the 182 other respondents. Because several answers could be selected, the data are shown as absolute values of all responses selected (220 total). B. Responses to question 23. Among the 467 respondents, 43 responded “other” and explained it as “not applicable.” This graph represents the responses of the 424 other respondents. Because several answers could be selected, the data are shown as absolute values of all responses selected (506 total). C. Responses to question 26. Among the 454 respondents, 206 responded either “not applicable” or “don't have access to information necessary to provide an estimate.” This graph represents the responses of the 248 other respondents. D. Responses to question 25. Among the 467 respondents, 53 responded “other” and explained it as “not applicable.” This graph represents the responses of the 414 other respondents. Because several answers could be selected, the data are shown as absolute values of all responses selected (470 total).
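The captions above repeatedly note that multi-select questions are reported as absolute counts of all responses selected rather than percentages, because each respondent could pick several answers and the tallies therefore exceed the respondent count. A minimal sketch of that tallying approach, using invented response data (the option labels mirror question 8a; the selections themselves are hypothetical):

```python
from collections import Counter

# Hypothetical multi-select responses: each inner list holds every option one
# respondent selected, so totals can exceed the number of respondents.
responses = [
    ["PI"],
    ["PI", "self-induced"],
    ["colleague/collaborator"],
    ["PI", "colleague/collaborator", "self-induced"],
]

# Tally absolute counts per option across all selections.
counts = Counter(choice for selections in responses for choice in selections)
total_selections = sum(counts.values())

print(dict(counts))      # per-option absolute counts
print(total_selections)  # 7 selections from only 4 respondents
```

Reporting absolute counts (e.g., "359 total" from 220 respondents in Fig 1C) sidesteps the ambiguity of percentages whose numerator and denominator count different things.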

References

    1. Begley CG, Ellis LM. Drug development: Raise standards for preclinical cancer research. Nature. 2012;483(7391):531–3. doi: 10.1038/483531a. - DOI - PubMed
    1. Prinz F, Schlange T, Asadullah K. Believe it or not: how much can we rely on published data on potential drug targets? Nat Rev Drug Discov. 2011;10(9):712. doi: 10.1038/nrd3439-c1. - DOI - PubMed
    1. Baker M, Dolgin E. Cancer reproducibility project releases first results. Nature. 2017;541(7637):269–70. doi: 10.1038/541269a. - DOI - PubMed
    1. Mobley A, Linder SK, Braeuer R, Ellis LM, Zwelling L. A survey on data reproducibility in cancer research provides insights into our limited ability to translate findings from the laboratory to the clinic. PLoS One. 2013;8(5):e63221. doi: 10.1371/journal.pone.0063221. - DOI - PMC - PubMed
    1. Baker M. 1,500 scientists lift the lid on reproducibility. Nature. 2016;533(7604):452–4. doi: 10.1038/533452a. - DOI - PubMed
