Web-Based Software Tools for Systematic Literature Review in Medicine: Systematic Search and Feature Analysis
- PMID: 35499859
- PMCID: PMC9112080
- DOI: 10.2196/33219
Erratum in
- Correction: Web-Based Software Tools for Systematic Literature Review in Medicine: Systematic Search and Feature Analysis. JMIR Med Inform. 2022 Nov 23;10(11):e43520. doi: 10.2196/43520. PMID: 36417760.
Abstract
Background: Systematic reviews (SRs) are central to evaluating therapies but are costly in both time and money. Many software tools exist to assist with SRs, but most do not support the full process, and the transparency and replicability of an SR depend on performing and presenting evidence according to established best practices.
Objective: This study aims to provide a basis for comparing and selecting among web-based software tools that support SRs by conducting a feature-by-feature comparison of SR tools.
Methods: We searched for SR tools by reviewing every tool listed in the SR Toolbox, tools covered in previous reviews of SR software, and tools found through qualitative Google searching. We included all SR tools that were currently functional and required no coding, and excluded reference managers, desktop applications, and statistical software. The list of features to assess was populated by combining all features assessed in 4 previous reviews of SR tools; we also added 5 features (manual addition, screening automation, dual extraction, living review, and public outputs) that were independently noted as best practices or as enhancements of transparency and replicability. Then, 2 reviewers assigned binary present-or-absent assessments to all SR tools with respect to all features, and a third reviewer adjudicated all disagreements.
Results: Of the 53 SR tools found, 55% (29/53) were excluded, leaving 45% (24/53) for assessment. In total, 30 features were assessed across 6 classes, and the interobserver agreement was 86.46%. Giotto Compliance (27/30, 90%), DistillerSR (26/30, 87%), and Nested Knowledge (26/30, 87%) support the most features, followed by EPPI-Reviewer Web (25/30, 83%), LitStream (23/30, 77%), JBI SUMARI (21/30, 70%), and SRDB.PRO (VTS Software) (21/30, 70%). Fewer than half of all the features assessed are supported by 7 tools: RobotAnalyst (National Centre for Text Mining), SRDR (Agency for Healthcare Research and Quality), SyRF (Systematic Review Facility), Data Abstraction Assistant (Center for Evidence Synthesis in Health), SR Accelerator (Institute for Evidence-Based Healthcare), RobotReviewer (RobotReviewer), and COVID-NMA (COVID-NMA). Notably, of the 24 tools, only 10 (42%) support direct search, only 7 (29%) offer dual extraction, and only 13 (54%) offer living/updatable reviews.
Conclusions: DistillerSR, Nested Knowledge, and EPPI-Reviewer Web each offer a high density of SR-focused web-based tools. By transparent comparison and discussion regarding SR tool functionality, the medical community can both choose among existing software offerings and note the areas of growth needed, most notably in the support of living reviews.
Keywords: feature analysis; software tools; systematic reviews.
©Kathryn Cowie, Asad Rahmatullah, Nicole Hardy, Karl Holub, Kevin Kallmes. Originally published in JMIR Medical Informatics (https://medinform.jmir.org), 02.05.2022.
Conflict of interest statement
Conflicts of Interest: KC, NH, and KH work for and hold equity in Nested Knowledge, which provides a software application included in this assessment. AR worked for Nested Knowledge. KL works for and holds equity in Nested Knowledge, Inc, and holds equity in Superior Medical Experts, Inc. KK works for and holds equity in Nested Knowledge, and holds equity in Superior Medical Experts.
