Reporting of harms outcomes: a comparison of journal publications with unpublished clinical study reports of orlistat trials

Alex Hodkinson et al.

Trials. 2016 Apr 22;17(1):207. doi: 10.1186/s13063-016-1327-z.

Meta-Analysis
Abstract

Background: The quality of harms reporting in journal publications is often poor, which can impede the risk-benefit interpretation of a clinical trial. Clinical study reports can provide more reliable, complete, and informative data on harms than the corresponding journal publications. This case study compares the quality and quantity of harms data reported in journal publications and clinical study reports of orlistat trials.

Methods: Publications related to clinical trials of orlistat were identified through comprehensive literature searches. A request was made to Roche (Genentech; South San Francisco, CA, USA) for clinical study reports related to the orlistat trials identified in our search. We compared adverse events, serious adverse events, and the reporting of 15 harms criteria in both document types and compared meta-analytic results using data from the clinical study reports against the journal publications.
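
As an illustration of the pooled comparison described above, the sketch below computes a fixed-effect (inverse-variance) pooled risk difference with a 95 % confidence interval. This is a minimal sketch of the general meta-analytic technique only; the per-trial event counts are hypothetical placeholders, not data from the orlistat trials.

# Minimal sketch: inverse-variance fixed-effect pooled risk difference.
# The counts below are hypothetical placeholders, not orlistat trial data.
import math

def risk_difference(events_t, n_t, events_c, n_c):
    """Risk difference (treatment minus control) and its variance."""
    p_t, p_c = events_t / n_t, events_c / n_c
    rd = p_t - p_c
    var = p_t * (1 - p_t) / n_t + p_c * (1 - p_c) / n_c
    return rd, var

def pooled_risk_difference(trials):
    """Fixed-effect (inverse-variance) pooled risk difference with 95 % CI."""
    weights, rds = [], []
    for events_t, n_t, events_c, n_c in trials:
        rd, var = risk_difference(events_t, n_t, events_c, n_c)
        weights.append(1 / var)
        rds.append(rd)
    pooled = sum(w * rd for w, rd in zip(weights, rds)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical per-trial counts: (events_treatment, n_treatment, events_control, n_control)
trials = [(30, 200, 18, 200), (25, 150, 15, 150), (40, 300, 22, 300)]
rd, ci = pooled_risk_difference(trials)
print(f"Pooled risk difference: {rd:.3f} (95% CI {ci[0]:.3f} to {ci[1]:.3f})")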

Results: Five journal publications with matching clinical study reports were available for five independent clinical trials. Journal publications did not always report the complete list of identified adverse events and serious adverse events. We found some differences in the magnitude of the pooled risk difference between the two document types, with a statistically significant risk difference for three adverse events and two serious adverse events when using data reported in the clinical study reports; these events were of mild intensity and unrelated to orlistat. The CONSORT harms reporting criteria were satisfied more often in the methods sections of the clinical study reports (70-90 % of the methods section criteria, compared with 10-50 % in the journal publications), but both document types satisfied 80-100 % of the results section criteria, albeit with greater detail provided in the clinical study reports.

Conclusions: In this case study, journal publications provided insufficient information on the harms outcomes of clinical trials and did not specify that only a subset of harms data was being presented. Clinical study reports often contain data on harms, including serious adverse events, that are not reported or mentioned in the journal publications. Clinical study reports could therefore support a more complete, accurate, and reliable investigation, and researchers undertaking evidence synthesis of harms outcomes should not rely solely on the incomplete data presented in journal publications.

Keywords: Adverse effect; Adverse event; Clinical study report; Evidence-based healthcare; Harms; Obesity; Orlistat; Randomised controlled trial; Systematic review.

Figures

Fig. 1. Flow diagram for obtaining the trial reports.

Fig. 2. The total number of MedDRA preferred terms (adverse events) reported in clinical study reports (CSRs) and journal publications across all five trials. Footnote: Total = total number of individual MedDRA preferred terms related to AEs reported across the CSR and journal publication for a trial.

Fig. 3. The total number of serious adverse events reported in the clinical study reports (CSRs) and journal publications across all five trials. Footnote: Total = total number of individual MedDRA preferred terms related to SAEs reported across the CSR and journal publication for a trial.

