The Impact of Systematic Review Automation Tools on Methodological Quality and Time Taken to Complete Systematic Review Tasks: Case Study
- PMID: 34057072
- PMCID: PMC8204237
- DOI: 10.2196/24418
Abstract
Background: Systematic reviews (SRs) are considered the highest level of evidence to answer research questions; however, they are time and resource intensive.
Objective: When comparing SR tasks done manually using standard methods versus the same tasks done using automated tools, (1) what is the difference in the time taken to complete the SR task and (2) what is the impact on the error rate of the SR task?
Methods: A case study compared specific tasks done during the conduct of an SR on prebiotic, probiotic, and synbiotic supplementation in chronic kidney disease. Two participants (the manual team) conducted the SR using current standard methods, comprising a total of 16 tasks. Another two participants (the automation team) conducted only those tasks for which a systematic review automation (SRA) tool was available, comprising a total of six tasks. The time taken and error rate of the six tasks completed by both teams were compared.
Results: The approximate time for the manual team to produce a draft of the background, methods, and results sections of the SR was 126 hours. For the six tasks in which times were compared, the manual team spent 2493 minutes (42 hours), compared with 708 minutes (12 hours) for the automation team. The manual team had a higher error rate in two of the six tasks: in Task 5 (run the systematic search), the manual team made eight errors versus three made by the automation team; in Task 12 (assess the risk of bias), 25 of the manual team's assessments differed from a reference standard, compared with 20 for the automation team. The manual team had a lower error rate in one of the six tasks: in Task 6 (deduplicate search results), the manual team removed one unique study and missed zero duplicates, whereas the automation team removed two unique studies and missed seven duplicates. Error rates were similar for the two remaining compared tasks: in Task 7 (screen the titles and abstracts) and Task 9 (screen the full text), neither team excluded any relevant studies. One task, Task 8 (find the full text), could not be compared between groups.
Conclusions: For the majority of SR tasks where an SRA tool was used, the time required to complete that task was reduced for novice researchers while methodological quality was maintained.
Keywords: automation; case study; comparison study; methods evaluation; systematic reviews; technology assessment.
©Justin Clark, Catherine McFarlane, Gina Cleo, Christiane Ishikawa Ramos, Skye Marshall. Originally published in JMIR Medical Education (https://mededu.jmir.org), 31.05.2021.
Conflict of interest statement
Conflicts of Interest: JC declares that he is a developer of some of the tools used in this study and has won prize money from the Australian Library Information Association to continue development of these tools.
Similar articles
- A full systematic review was completed in 2 weeks using automation tools: a case study. J Clin Epidemiol. 2020 May;121:81-90. doi: 10.1016/j.jclinepi.2020.01.008. Epub 2020 Jan 28. PMID: 32004673
- Automation of systematic reviews of biomedical literature: a scoping review of studies indexed in PubMed. Syst Rev. 2024 Jul 8;13(1):174. doi: 10.1186/s13643-024-02592-3. PMID: 38978132. Free PMC article.
- How to conduct systematic reviews more expeditiously? Syst Rev. 2015 Nov 12;4:160. doi: 10.1186/s13643-015-0147-7. PMID: 26563648. Free PMC article.
- Crowdsourcing the Citation Screening Process for Systematic Reviews: Validation Study. J Med Internet Res. 2019 Apr 29;21(4):e12953. doi: 10.2196/12953. PMID: 31033444. Free PMC article.
- The 2-week systematic review (2weekSR) method was successfully blind-replicated by another team: a case study. J Clin Epidemiol. 2024 Jan;165:111197. doi: 10.1016/j.jclinepi.2023.10.013. Epub 2023 Oct 23. PMID: 37879542
Cited by
- How much can we save by applying artificial intelligence in evidence synthesis? Results from a pragmatic review to quantify workload efficiencies and cost savings. Front Pharmacol. 2025 Jan 31;16:1454245. doi: 10.3389/fphar.2025.1454245. eCollection 2025. PMID: 39959426. Free PMC article.
- Validation of automated paper screening for esophagectomy systematic review using large language models. PeerJ Comput Sci. 2025 Apr 30;11:e2822. doi: 10.7717/peerj-cs.2822. eCollection 2025. PMID: 40567772. Free PMC article.
- An Automated Literature Review Tool (LiteRev) for Streamlining and Accelerating Research Using Natural Language Processing and Machine Learning: Descriptive Performance Evaluation Study. J Med Internet Res. 2023 Sep 15;25:e39736. doi: 10.2196/39736. PMID: 37713261. Free PMC article.
- How to make a systematic review live up to its name: perspectives from journal editors. Ann Transl Med. 2023 Jun 30;11(9):325. doi: 10.21037/atm-22-6305. Epub 2023 May 10. PMID: 37404995. Free PMC article. No abstract available.
- Machine Learning Methods for Systematic Reviews: A Rapid Scoping Review. Dela J Public Health. 2023 Nov 30;9(4):40-47. doi: 10.32481/djph.2023.11.008. eCollection 2023 Nov. PMID: 38173960. Free PMC article.
References
- Coleman K, Norris S, Weston A, Grimmer-Somers K, Hillier S, Merlin T, Middleton P, Tooher R, Salisbury J. NHMRC Additional Levels of Evidence and Grades for Recommendations for Developers of Guidelines. Canberra, Australia: National Health and Medical Research Council (NHMRC); 2009. [2021-05-11]. https://www.mja.com.au/sites/default/files/NHMRC.levels.of.evidence.2008....
- Higgins JPT, Green S. Cochrane Handbook for Systematic Reviews of Interventions. Version 5.1.0. London, UK: The Cochrane Collaboration; 2011.
- Tsertsvadze A, Chen Y, Moher D, Sutcliffe P, McCarthy N. How to conduct systematic reviews more expeditiously? Syst Rev. 2015 Nov 12;4:160. doi: 10.1186/s13643-015-0147-7.
- Créquit P, Trinquart L, Yavchitz A, Ravaud P. Wasted research when systematic reviews fail to provide a complete and up-to-date evidence synthesis: The example of lung cancer. BMC Med. 2016 Jan 20;14:8. doi: 10.1186/s12916-016-0555-0.
- Borah R, Brown AW, Capers PL, Kaiser KA. Analysis of the time and workers needed to conduct systematic reviews of medical interventions using data from the PROSPERO registry. BMJ Open. 2017 Feb 27;7(2):e012545. doi: 10.1136/bmjopen-2016-012545.