Observing many researchers using the same data and hypothesis reveals a hidden universe of uncertainty
- PMID: 36306328
- PMCID: PMC9636921
- DOI: 10.1073/pnas.2203150119
Erratum in
- Correction for Breznau et al., Observing many researchers using the same data and hypothesis reveals a hidden universe of uncertainty. Proc Natl Acad Sci U S A. 2024 Jun 25;121(26):e2410677121. doi: 10.1073/pnas.2410677121. Epub 2024 Jun 20. PMID: 38900802.
Abstract
This study explores how researchers' analytical choices affect the reliability of scientific findings. Most discussions of reliability problems in science focus on systematic biases. We broaden the lens to emphasize the idiosyncrasy of conscious and unconscious decisions that researchers make during data analysis. We coordinated 161 researchers in 73 research teams and observed their research decisions as they used the same data to independently test the same prominent social science hypothesis: that greater immigration reduces support for social policies among the public. In this typical case of social science research, research teams reported both widely diverging numerical findings and substantive conclusions despite identical start conditions. Researchers' expertise, prior beliefs, and expectations barely predict the wide variation in research outcomes. More than 95% of the total variance in numerical results remains unexplained even after qualitative coding of all identifiable decisions in each team's workflow. This reveals a universe of uncertainty that remains hidden when considering a single study in isolation. The idiosyncratic nature of how researchers' results and conclusions varied is a previously underappreciated explanation for why many scientific hypotheses remain contested. These results call for greater epistemic humility and clarity in reporting scientific findings.
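The abstract's central quantitative claim is a variance decomposition: even after coding every identifiable analytical decision in each team's workflow, those decisions account for less than 5% of the variance in the teams' numerical results. As a rough illustration of that kind of calculation (not the authors' actual analysis), the sketch below regresses simulated team-level effect estimates on hypothetical dummy-coded decision indicators and reports the explained versus unexplained variance shares. All data, file, and column names here are placeholders invented for the example.

```python
# Minimal sketch of a variance-decomposition check, assuming hypothetical
# team-level data: one row per research team, with the team's estimated effect
# of immigration on policy support plus 0/1 indicators for coded modeling
# decisions. The simulated decisions are pure noise, so the explained share
# will be small; this only illustrates the computation, not the study's data.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
teams = pd.DataFrame({
    "effect_estimate": rng.normal(0.0, 0.05, size=73),   # one estimate per team
    "uses_logit":      rng.integers(0, 2, size=73),       # hypothetical coded decision
    "multilevel":      rng.integers(0, 2, size=73),       # hypothetical coded decision
    "stock_measure":   rng.integers(0, 2, size=73),       # hypothetical coded decision
})

X = teams.drop(columns="effect_estimate").to_numpy()
y = teams["effect_estimate"].to_numpy()

model = LinearRegression().fit(X, y)
r_squared = model.score(X, y)   # share of variance explained by the coded decisions
print(f"explained: {r_squared:.1%}, unexplained: {1 - r_squared:.1%}")
```

In the study's terms, a large unexplained share means the spread of results is driven by idiosyncratic choices that the coded decision variables do not capture.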
Keywords: analytical flexibility; immigration and policy preferences; many analysts; metascience; researcher degrees of freedom.
Conflict of interest statement
The authors declare no competing interest.
Comment in
- A universe of uncertainty hiding in plain sight. Proc Natl Acad Sci U S A. 2023 Jan 10;120(2):e2218530120. doi: 10.1073/pnas.2218530120. Epub 2023 Jan 3. PMID: 36595682.
- Variation across analysts in statistical significance, yet consistently small effect sizes. Proc Natl Acad Sci U S A. 2023 Jan 17;120(3):e2218957120. doi: 10.1073/pnas.2218957120. Epub 2023 Jan 9. PMID: 36623183.