Meta-Analysis

BMJ. 2023 Jul 11;382:e075767. doi: 10.1136/bmj-2023-075767.

Prevalence and predictors of data and code sharing in the medical and health sciences: systematic review with meta-analysis of individual participant data

Daniel G Hamilton et al. BMJ.

Abstract

Objectives: To synthesise research investigating data and code sharing in medicine and health to establish an accurate representation of the prevalence of sharing, how this frequency has changed over time, and what factors influence availability.

Design: Systematic review with meta-analysis of individual participant data.

Data sources: Ovid Medline, Ovid Embase, and the preprint servers medRxiv, bioRxiv, and MetaArXiv were searched from inception to 1 July 2021. Forward citation searches were also performed on 30 August 2022.

Review methods: Meta-research studies that investigated data or code sharing across a sample of scientific articles presenting original medical and health research were identified. Two authors screened records, assessed the risk of bias, and extracted summary data from study reports when individual participant data could not be retrieved. Key outcomes of interest were the prevalence of statements that declared that data or code were publicly or privately available (declared availability) and the success rates of retrieving these products (actual availability). The associations between data and code availability and several factors (eg, journal policy, type of data, trial design, and human participants) were also examined. A two stage approach to meta-analysis of individual participant data was performed, with proportions and risk ratios pooled with the Hartung-Knapp-Sidik-Jonkman method for random effects meta-analysis.
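To illustrate the second stage of the pooling approach described above, the sketch below pools study level proportions under a random effects model and applies a Hartung-Knapp-Sidik-Jonkman adjusted confidence interval. This is not the authors' analysis code; it is a minimal Python illustration that assumes logit transformed proportions and a DerSimonian-Laird estimate of between-study variance, and the function name and example counts are hypothetical.

import numpy as np
from scipy import stats

def pool_proportions_hksj(events, totals, alpha=0.05):
    """Second-stage pooling of study-level proportions with a random effects
    model and a Hartung-Knapp-Sidik-Jonkman (HKSJ) adjusted confidence interval.
    Illustrative sketch only; not the review's analysis code."""
    events = np.asarray(events, dtype=float)
    totals = np.asarray(totals, dtype=float)
    k = len(events)

    # Stage 1 inputs: logit-transformed proportions (0.5 continuity correction)
    e = events + 0.5
    n = totals + 1.0
    y = np.log(e / (n - e))            # logit of each study's proportion
    v = 1.0 / e + 1.0 / (n - e)        # approximate within-study variances

    # DerSimonian-Laird estimate of between-study variance tau^2
    w_fixed = 1.0 / v
    mu_fixed = np.sum(w_fixed * y) / np.sum(w_fixed)
    q = np.sum(w_fixed * (y - mu_fixed) ** 2)
    c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
    tau2 = max(0.0, (q - (k - 1)) / c)

    # Random effects pooled estimate
    w = 1.0 / (v + tau2)
    mu = np.sum(w * y) / np.sum(w)

    # HKSJ-adjusted standard error with a t-based confidence interval
    se_hksj = np.sqrt(np.sum(w * (y - mu) ** 2) / ((k - 1) * np.sum(w)))
    t_crit = stats.t.ppf(1 - alpha / 2, df=k - 1)
    lo, hi = mu - t_crit * se_hksj, mu + t_crit * se_hksj

    # Back-transform from the logit scale to proportions
    expit = lambda x: 1.0 / (1.0 + np.exp(-x))
    return expit(mu), expit(lo), expit(hi)

# Hypothetical example: pooled prevalence across five meta-research studies
print(pool_proportions_hksj(events=[12, 3, 40, 7, 1],
                            totals=[195, 113, 475, 250, 150]))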

Results: The review included 105 meta-research studies examining 2 121 580 articles across 31 specialties. Eligible studies examined a median of 195 primary articles (interquartile range 113-475), with a median publication year of 2015 (interquartile range 2012-2018). Only eight studies (8%) were classified as having a low risk of bias. Meta-analyses showed a prevalence of declared and actual public data availability of 8% (95% confidence interval 5% to 11%) and 2% (1% to 3%), respectively, between 2016 and 2021. For public code sharing, the prevalence of both declared and actual availability was estimated to be <0.5% since 2016. Meta-regressions indicated that only declared public data sharing prevalence estimates have increased over time. Compliance with mandatory data sharing policies ranged from 0% to 100% across journals and varied by type of data. In contrast, success in privately obtaining data and code from authors historically ranged from 0% to 37% for data and from 0% to 23% for code.

Conclusions: The review found that public code sharing was persistently low across medical research. Declared data sharing was also low and, although it increased over time, did not always correspond to actual sharing of data. The effectiveness of mandatory data sharing policies varied substantially by journal and type of data, a finding that might be informative for policy makers when designing policies and allocating resources to audit compliance.

Systematic review registration: Open Science Framework doi:10.17605/OSF.IO/7SX8U.


Conflict of interest statement

Competing interests: All authors have completed the ICMJE uniform disclosure form at https://www.icmje.org/disclosure-of-interest/ and declare: support from the Australian Research Council during the conduct of this research for the submitted work; some authors had support from research institutions listed in the funding statement; some authors received financial support to attend an international congress to present preliminary results of this project; no financial relationships with any organisations that might have an interest in the submitted work in the previous three years; no other relationships or activities that could appear to have influenced the submitted work.

Figures

Fig 1
Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) 2020 and PRISMA-individual participant data (IPD) flow diagram. *Forward citation search was performed on 30 August 2022. †Aggregate data were derived from partial IPD, reports, or authors. ‡Number of observations does not account for the potential presence of duplicate or non-medical primary articles. §Number of observations accounts for non-medical primary articles and duplicate primary articles within, but not between, meta-research studies. ¶Number of observations accounts for non-medical primary articles and both duplicate primary articles within and between meta-research studies. IRB=institutional review board; o=number of primary articles.

Fig 2
Prevalence of declared public data sharing between 2016 and 2021. ROB=risk of bias; GLMM=generalised linear mixed model; HKSJ=Hartung-Knapp-Sidik-Jonkman; IPD=individual participant data. Serghiou 2021a and 2021b refer to the manual and automated assessments, respectively, reported in Serghiou et al25

Fig 3
Prevalence of actual public data sharing between 2016 and 2021. ROB=risk of bias; GLMM=generalised linear mixed model; HKSJ=Hartung-Knapp-Sidik-Jonkman; IPD=individual participant data. Serghiou 2021a refers to the manual assessments reported in Serghiou et al25

Fig 4
Prevalence of declared public code sharing between 2016 and 2021. ROB=risk of bias; GLMM=generalised linear mixed model; HKSJ=Hartung-Knapp-Sidik-Jonkman; IPD=individual participant data. Serghiou 2021a and 2021b refer to the manual and automated assessments, respectively, reported in Serghiou et al25

Fig 5
Prevalence of actual public code sharing between 2016 and 2021. ROB=risk of bias; GLMM=generalised linear mixed model; HKSJ=Hartung-Knapp-Sidik-Jonkman; IPD=individual participant data

Fig 6
Prevalence of successful responses to private requests for data and code from published medical research. SEM=structural equation modelling; NA=summary data not available; IPD=individual participant data

Fig 7
Association between trial design and prevalence of declared public data sharing. BGLMM=bivariate generalised linear mixed model; HKSJ=Hartung-Knapp-Sidik-Jonkman; IPD=individual participant data. Serghiou 2021a refers to the manual assessments reported in Serghiou et al25

Fig 8
Association between trial design and prevalence of actual public data sharing. BGLMM=bivariate generalised linear mixed model; HKSJ=Hartung-Knapp-Sidik-Jonkman; IPD=individual participant data. Serghiou 2021a refers to the manual assessments reported in Serghiou et al25

Fig 9
Association between type of research participant and prevalence of declared public data sharing. ROB=risk of bias; BGLMM=bivariate generalised linear mixed model; HKSJ=Hartung-Knapp-Sidik-Jonkman; IPD=individual participant data

Fig 10
Association between type of research participant and prevalence of actual public data sharing. ROB=risk of bias; BGLMM=bivariate generalised linear mixed model; HKSJ=Hartung-Knapp-Sidik-Jonkman; IPD=individual participant data

Fig 11
Bubble plot of the prevalence of declared (top) and actual (bottom) data sharing by publication year with fitted meta-regression lines, 95% confidence intervals (dark purple shaded area), and 95% prediction intervals (light purple shaded area). Circles are scaled relative to the natural log of the sample size
