PLoS One. 2017 Sep 13;12(9):e0183591. doi: 10.1371/journal.pone.0183591. eCollection 2017.

A checklist is associated with increased quality of reporting preclinical biomedical research: A systematic review

SeungHye Han et al. PLoS One. 2017.

Abstract

Irreproducibility of preclinical biomedical research has recently gained attention. It has been suggested that requiring authors to complete a checklist at the time of manuscript submission would improve the quality and transparency of scientific reporting, and ultimately enhance reproducibility. Whether a checklist enhances quality and transparency in reporting preclinical animal studies, however, has not been empirically studied. Here we searched two highly cited life science journals, one that requires a checklist at submission (Nature) and one that does not (Cell), to identify in vivo animal studies. After screening 943 articles, a total of 80 articles published in 2013 (pre-checklist) and 2015 (post-checklist) were identified and included for detailed evaluation of reported methodological and analytical information. We compared the quality of reporting of preclinical animal studies between the two journals, accounting for differences between journals and changes over time in reporting. We find that reporting of randomization, blinding, and sample-size estimation improved significantly more in Nature than in Cell from 2013 to 2015, likely due to implementation of a checklist. Specifically, improvement in reporting of these three methodological items was at least three times greater when a mandatory checklist was implemented than when it was not. Reporting of the sex of animals and of the number of independent experiments performed also improved from 2013 to 2015, likely from factors not related to a checklist. Our study demonstrates that completing a checklist at manuscript submission is associated with improved reporting of key methodological information in preclinical animal studies.

Conflict of interest statement

Competing Interests: The authors have declared that no competing interests exist.

Figures

Fig 1
Fig 1. Outline of the study.
(A) Selection of articles: For both 2013 and 2015, twenty consecutive articles that met the inclusion criteria were selected from those published beginning in January in Nature (the journal that implemented a pre-submission checklist) and in Cell (the journal that did not). These articles span the periods before and after implementation of the checklist in May 2013. (B) Flow of the analysis: To examine whether the quality of reporting improved over time, the degree of key information reported in 2015 was compared with that in 2013 for both journals combined (Objective 1). To assess whether a checklist is associated with improved quality of reporting, we first compared the change over time in Nature (④ vs. ③). If there was a significant difference, we then compared 2015 vs. 2013 in Cell (② vs. ①), and Nature vs. Cell within 2013 (③ vs. ①) and within 2015 (④ vs. ②), to adjust for differences between journals and changes over time in reporting (Objective 2; a code sketch of this comparison scheme follows the figure legends).
Fig 2
Fig 2. Distribution of reporting study designs across time.
The distributions of reporting status are presented as stacked bar graphs. The numbers inside the stacks are the numbers of articles corresponding to each percentage. The data for 2013 and 2015 are the total numbers of articles assessed from Cell and Nature within a given year. A Fisher exact test was performed to assess the difference in reporting of each methodological item across time. Significant P values (< 0.05) are provided.
Fig 3
Fig 3. The changes in rigorous reporting of study designs by a checklist.
The numbers inside the pie charts are the numbers of articles corresponding to each category. P values < 0.10 from the Fisher exact test are provided for comparisons of 2015 vs. 2013 within the intervention (Nature) or comparison (Cell) group, and for comparisons of the intervention vs. the comparison group within 2013 and within 2015, respectively. ≠ is shown where the comparison between the two adjacent groups is significantly different at P < 0.05.
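
To make the comparison scheme in Fig 1B and the Fisher exact tests in Figs 2 and 3 concrete, the following is a minimal sketch in Python using SciPy's fisher_exact. The article counts are hypothetical placeholders chosen only for illustration; they are not the study's data.

    # Minimal sketch of the journal-by-year comparisons with hypothetical counts.
    # Each entry is (articles reporting randomization, articles not reporting it)
    # out of 20 articles per journal per year; these numbers are NOT the study's data.
    from scipy.stats import fisher_exact

    counts = {
        ("Cell",   2013): (2, 18),   # group ① (hypothetical)
        ("Cell",   2015): (3, 17),   # group ② (hypothetical)
        ("Nature", 2013): (3, 17),   # group ③ (hypothetical)
        ("Nature", 2015): (12, 8),   # group ④ (hypothetical)
    }

    # The four comparisons described for Objective 2 in Fig 1B.
    comparisons = [
        (("Nature", 2015), ("Nature", 2013)),  # change within the checklist journal
        (("Cell",   2015), ("Cell",   2013)),  # change within the comparison journal
        (("Nature", 2013), ("Cell",   2013)),  # between journals, pre-checklist
        (("Nature", 2015), ("Cell",   2015)),  # between journals, post-checklist
    ]

    for a, b in comparisons:
        # 2 x 2 table: rows are the two groups, columns are reported / not reported.
        odds_ratio, p_value = fisher_exact([counts[a], counts[b]])
        print(f"{a} vs. {b}: OR = {odds_ratio:.2f}, P = {p_value:.3f}")

A Fisher exact test suits comparisons like these because each group contains only 20 articles, so expected cell counts are small; with counts like the hypothetical ones above, the within-Nature change would reach significance while the within-Cell change would not, mirroring the pattern the figures describe.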
