Sensitivity analysis for the interactive effects of internal bias and publication bias in meta-analyses
- PMID: 37743567
- PMCID: PMC11164126
- DOI: 10.1002/jrsm.1667
Abstract
Meta-analyses can be compromised by studies' internal biases (e.g., confounding in nonrandomized studies) as well as publication bias. These biases often operate nonadditively: publication bias that favors significant, positive results selects indirectly for studies with more internal bias. We propose sensitivity analyses that address two questions: (1) "For a given severity of internal bias across studies and of publication bias, how much could the results change?"; and (2) "For a given severity of publication bias, how severe would internal bias have to be, hypothetically, to attenuate the results to the null or by a given amount?" These methods consider the average internal bias across studies, obviating the need to specify the bias in each study individually. The analyst can assume that internal bias affects all studies, or alternatively that it affects only a known subset (e.g., nonrandomized studies). The internal bias can be of unknown origin or, for certain types of bias in causal estimates, can be bounded analytically. The analyst can specify the severity of publication bias or, alternatively, consider a "worst-case" form of publication bias. Robust estimation methods accommodate non-normal effects, small meta-analyses, and clustered estimates. As we illustrate by re-analyzing published meta-analyses, the methods can provide insights that are not captured by simply considering each bias in turn. An R package implementing the methods is available (multibiasmeta).
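To give a rough sense of question (1), the sketch below shifts a naive inverse-variance pooled estimate by an assumed average additive internal bias across studies. This is a minimal, hypothetical illustration of the general sensitivity-analysis idea only; it is not the multibiasmeta implementation, which additionally models publication bias and their interaction, and the function name and toy data are invented for this example.

```python
import numpy as np

def bias_shifted_pooled(estimates, variances, mean_bias):
    """Naive inverse-variance (fixed-effect) pooled estimate,
    shifted by an assumed average additive internal bias across
    studies. Illustrative sketch only, not the multibiasmeta method."""
    w = 1.0 / np.asarray(variances, dtype=float)
    pooled = np.sum(w * np.asarray(estimates, dtype=float)) / np.sum(w)
    # Subtract the hypothesized average internal bias from the pooled estimate
    return pooled - mean_bias

# Toy example: three study effect estimates (e.g., log odds ratios)
est = [0.40, 0.55, 0.30]
var = [0.04, 0.09, 0.05]
print(bias_shifted_pooled(est, var, mean_bias=0.20))
```

Varying `mean_bias` over a plausible range then traces how severe the average internal bias would need to be to move the pooled estimate to the null, which is the spirit of question (2).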
Keywords: bias analysis; file drawer; internal validity; selective reporting.
© 2023 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
Conflict of interest statement
The authors declare no conflict of interest.