Challenges and suggestions for defining replication "success" when effects may be heterogeneous: Comment on Hedges and Schauer (2019)
- PMID: 31580141
- PMCID: PMC6779319
- DOI: 10.1037/met0000223
Abstract
Psychological scientists are increasingly conducting new replications of published research to confirm its findings. In one increasingly widespread replication study design, each of several collaborating sites (such as universities) independently attempts to replicate an original study, and the results are then synthesized across sites. Hedges and Schauer (2019) proposed statistical analyses for these replication projects; their analyses focus on assessing the extent to which results differ across the replication sites by testing for heterogeneity among the replication studies, while excluding the original study. We agree with their premises regarding the limitations of existing analysis methods and regarding the importance of accounting for heterogeneity among the replications, and assessing that heterogeneity may be interesting in its own right. However, we argue that by focusing only on whether the replication studies have effect sizes similar to one another, these analyses are not particularly appropriate for assessing whether the replications in fact support the scientific effect under investigation, nor for assessing the power of multisite replication projects. We reanalyze Hedges and Schauer's (2019) example dataset using alternative metrics of replication success that directly address these objectives. We reach a more optimistic conclusion regarding replication success than they did, illustrating that the alternative metrics can lead to quite different conclusions from those of Hedges and Schauer (2019).
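The heterogeneity assessment discussed above is conventionally based on Cochran's Q statistic computed from the site-level effect estimates and their variances. As a minimal illustrative sketch (not the authors' exact analysis, and the function names are ours), Q and the derived I² index can be computed as follows:

```python
# Illustrative sketch of a standard heterogeneity test (Cochran's Q),
# as commonly applied to site-level replication estimates.
# This is not Hedges and Schauer's (2019) exact procedure.

def cochran_q(estimates, variances):
    """Return (Q, df) for site-level effect estimates and their variances."""
    weights = [1.0 / v for v in variances]          # inverse-variance weights
    mu_hat = sum(w * y for w, y in zip(weights, estimates)) / sum(weights)
    q = sum(w * (y - mu_hat) ** 2 for w, y in zip(weights, estimates))
    return q, len(estimates) - 1

def i_squared(q, df):
    """Percent of variation attributable to heterogeneity rather than chance."""
    return max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0

# Two hypothetical sites with estimates 0.1 and 0.3, each with variance 0.01:
q, df = cochran_q([0.1, 0.3], [0.01, 0.01])
print(q, df, i_squared(q, df))  # → 2.0 1 50.0
```

Under the null hypothesis of homogeneous true effects, Q follows approximately a chi-squared distribution with df = k − 1 degrees of freedom, which is what a formal test would compare against.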
Comment in
- Consistency of effects is important in replication: Rejoinder to Mathur and VanderWeele (2019). Psychol Methods. 2019 Oct;24(5):576-577. doi: 10.1037/met0000237. PMID: 31580142
Comment on
- Statistical analyses for studying replication: Meta-analytic perspectives. Psychol Methods. 2019 Oct;24(5):557-570. doi: 10.1037/met0000189. Epub 2018 Aug 2. PMID: 30070547
