Can I trust this paper?
- PMID: 40670835
- DOI: 10.3758/s13423-025-02740-3
Abstract
After a decade of data falsification scandals and replication failures in psychology and related empirical disciplines, there are urgent calls for open science and structural reform in the publishing industry. In the meantime, however, researchers need to learn how to recognize tell-tale signs of methodological and conceptual shortcomings that make a published claim suspect. I review four key problems and propose simple ways to detect them. First, the study may be fake; if in doubt, inspect the authors' and journal's profiles and request to see the raw data to check for inconsistencies. Second, there may be too little data; low precision of effect sizes is a clear warning sign of this. Third, the data may not be analyzed correctly; excessive flexibility in data analysis can be deduced from signs of data dredging and convoluted post hoc theorizing in the text, while violations of model assumptions can be detected by examining plots of observed data and model predictions. Fourth, the conclusions may not be justified by the data; common issues are inappropriate acceptance of the null hypothesis, biased meta-analyses, over-generalization over unmodeled variance, hidden confounds, and unspecific theoretical predictions. The main takeaways are to verify that the methodology is robust and to distinguish between what the actual results are and what the authors claim these results mean when citing empirical work. Critical evaluation of published evidence is an essential skill to develop as it can prevent researchers from pursuing unproductive avenues and ensure better trustworthiness of science as a whole.
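The abstract's second warning sign, low precision of effect sizes, can be checked directly from a paper's reported descriptive statistics. The sketch below (a minimal illustration, not from the paper itself; the function name and the large-sample variance approximation for Cohen's d are my own choices) computes a standardized effect size and an approximate 95% confidence interval, so that a reader can see how wide the interval is for a small sample:

```python
import math

def cohens_d_ci(mean1, mean2, sd1, sd2, n1, n2, z=1.96):
    """Cohen's d with an approximate 95% confidence interval.

    A wide interval (low precision) is one warning sign described
    in the abstract: the study may simply contain too little data.
    """
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / sp
    # Standard large-sample approximation of the sampling variance of d
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, (d - z * se, d + z * se)

# A hypothetical small study (n = 15 per group): d = 0.75, but the
# interval spans roughly 0.01 to 1.49, i.e., anything from a
# negligible to a very large effect.
d, (lo, hi) = cohens_d_ci(10.5, 9.0, 2.0, 2.0, n1=15, n2=15)
```

Even when a paper reports a "significant" effect, an interval this wide signals that the point estimate carries little information on its own.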
Keywords: Power; Replication; Research integrity; Statistics.
© 2025. The Author(s).
Conflict of interest statement
Declarations. Ethics approval: Not applicable. Consent to participate: Not applicable. Consent for publication: Not applicable. Competing interests: No competing interests.