Review

Reporting preclinical anesthesia study (REPEAT): Evaluating the quality of reporting in the preclinical anesthesiology literature

Dean A Fergusson et al. PLoS One. 2019 May 23;14(5):e0215221. doi: 10.1371/journal.pone.0215221. eCollection 2019.

Abstract

Poor reporting quality may contribute to irreproducibility of results and failed 'bench-to-bedside' translation. Consequently, guidelines have been developed to improve the complete and transparent reporting of in vivo preclinical studies. To examine the impact of such guidelines on core methodological and analytical reporting items in the preclinical anesthesiology literature, we sampled a cohort of studies. Preclinical in vivo studies published in Anesthesiology, Anesthesia & Analgesia, Anaesthesia, and the British Journal of Anaesthesia (2008-2009, 2014-2016) were identified. Data were extracted independently and in duplicate. Reporting completeness was assessed using the National Institutes of Health Principles and Guidelines for Reporting Preclinical Research. Risk ratios were used for comparative analyses. Of 7615 screened articles, 604 met our inclusion criteria and included experiments reporting on 52,490 animals. The most common topic of investigation was pain and analgesia (30%), rodents were most frequently used (77%), and studies were most commonly conducted in the United States (36%). Use of preclinical reporting guidelines was listed in 10% of applicable articles. A minority of studies fully reported on replicates (0.3%), randomization (10%), blinding (12%), sample-size estimation (3%), and inclusion/exclusion criteria (5%). Statistics were well reported (81%). Comparative analysis demonstrated few differences in reporting rigor between journals, including those that endorsed reporting guidelines. Principal items of study design were infrequently reported, with few differences between journals. Methods to improve implementation and adherence to community-based reporting guidelines may be necessary to increase transparent and consistent reporting in the preclinical anesthesiology literature.
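The abstract states that risk ratios were used for comparative analyses of reporting between journals. As an illustration only (not the authors' code, and using hypothetical counts), a risk ratio with a Wald 95% confidence interval for complete reporting of a checklist item in one journal versus another could be computed as follows:

    import math

    def risk_ratio(events_a, total_a, events_b, total_b, z=1.96):
        """Risk ratio of group A vs. group B with a Wald 95% CI (log scale).

        Here an 'event' is a study that fully reports a given checklist item.
        All counts passed in below are hypothetical, for illustration only.
        """
        risk_a = events_a / total_a
        risk_b = events_b / total_b
        rr = risk_a / risk_b
        # Standard error of log(RR) for two independent binomial proportions
        se_log_rr = math.sqrt(
            1 / events_a - 1 / total_a + 1 / events_b - 1 / total_b
        )
        lower = math.exp(math.log(rr) - z * se_log_rr)
        upper = math.exp(math.log(rr) + z * se_log_rr)
        return rr, (lower, upper)

    # Hypothetical example: 24/200 studies in journal A vs. 15/190 in journal B
    # fully reported randomization.
    rr, ci = risk_ratio(24, 200, 15, 190)
    print(f"RR = {rr:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")

This is a minimal sketch of one common way to compute a risk ratio and its confidence interval; the paper's own analysis may have used different software or interval methods.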


Conflict of interest statement

The authors have declared that no competing interests exist.

Figures

Fig 1. Constructing our reporting checklist.
The National Institutes of Health preclinical reporting guidelines (NIH-PRG) consist of seven domains, each containing a multi-faceted recommendation. The recommendation for the blinding domain was deconstructed and two unidimensional items were identified.

Fig 2. Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA [24]) study selection diagram.

Fig 3. Distribution of publications.
World map depicting the number of articles published per country based on the corresponding author’s residency at the time of publication (image created using Tableau Software; Seattle, Washington, United States).

Fig 4. Reporting assessment results.
Completeness of reporting across all included studies (N = 604) against the deconstructed NIH-PRG. The data are displayed by item in each domain as a frequency (n) and as a percentage (n/N), where black and white correspond to an item being reported or not reported, respectively.


References

    1. Contopoulos-Ioannidis DG, Ntzani E, Ioannidis JP. Translation of highly promising basic science research into clinical applications. Am J Med. 2003;114(6):477–84.
    2. Kola I, Landis J. Can the pharmaceutical industry reduce attrition rates? Nat Rev Drug Discov. 2004;3(8):711–5. doi:10.1038/nrd1470
    3. Hackam DG, Redelmeier DA. Translation of research evidence from animals to humans. JAMA. 2006;296(14):1727–32.
    4. Landis SC, Amara SG, Asadullah K, Austin CP, Blumenstein R, Bradley EW, et al. A call for transparent reporting to optimize the predictive value of preclinical research. Nature. 2012;490(7419):187–91. doi:10.1038/nature11556
    5. Sandve GK, Nekrutenko A, Taylor J, Hovig E. Ten simple rules for reproducible computational research. PLoS Comput Biol. 2013;9(10):e1003285. doi:10.1371/journal.pcbi.1003285