Meta-Analysis

Syst Rev. 2023 Mar 28;12(1):56. doi: 10.1186/s13643-023-02223-3.

Delphi survey on the most promising areas and methods to improve systematic reviews' production and updating

Mersiha Mahmić-Kaknjo et al.

Abstract

Background: Systematic reviews (SRs) are invaluable evidence syntheses, widely used in biomedicine and other scientific areas. Tremendous resources are spent on producing and updating SRs, and there is a continuing need to automate the process and deploy the workforce and resources so as to make it faster and more efficient.

Methods: Information gathered by previous EVBRES research was used to construct a partly quantitative, partly qualitative questionnaire for round 1. Fifty-five experienced SR authors were invited to participate in a Delphi study (DS) designed to identify the most promising areas and methods for improving the efficient production and updating of SRs. Topic questions focused on which areas of SRs are most time-, effort-, and resource-intensive and should be prioritized in further research. Data were analysed using NVivo 12 Plus, Microsoft Excel 2013 and SPSS. Thematic analysis findings for the topics on which agreement was not reached in round 1 were used to prepare the questionnaire for round 2.

Results: Sixty percent (33/55) of the invited participants completed round 1, and 44% (24/55) completed round 2. Participants reported an average of 13.3 years of experience in conducting SRs (SD 6.8). More than two-thirds of respondents agreed/strongly agreed that the following topics should be prioritized: extracting data, literature searching, screening abstracts, obtaining and screening full texts, updating SRs, finding previous SRs, translating non-English studies, synthesizing data, project management, writing the protocol, constructing the search strategy, and critically appraising. Participants did not consider the following areas a priority: snowballing, GRADE-ing, writing the SR, deduplication, formulating the SR question, and performing meta-analysis.

Conclusions: Data extraction was prioritized by the majority of participants as an area needing more research/methods development. The quality of available language-translation tools (Google Translate, DeepL) has increased dramatically over the years, and a promising new tool for snowballing has emerged (Citation Chaser). Automation cannot substitute for human judgement where complex decisions are needed (GRADE-ing).

Trial registration: The study protocol was registered at https://osf.io/bp2hu/.

Keywords: Automation tools; Evidence synthesis; Prioritization.


Conflict of interest statement

None of the authors report any financial conflicts of interest with respect to the topic of this manuscript. All authors have a general interest in evidence synthesis methods. Some authors are associated with groups, conferences, and tools focusing on evidence synthesis methods: Barbara Nussbaumer-Streit is co-convenor of the Cochrane Rapid Reviews Methods Group. Raluca Sfetcu is a member of the JBI method group for “Systematic reviews of etiology and risk”. Ana Marušić is funded by the Croatian Science Foundation under Grant agreement No. IP-2019-04-4882.

Figures

Fig. 1
Demographic characteristics of participants. Since there were no extreme outlier values, mean and standard deviation were selected as the measures of central tendency and dispersion. *If a respondent's answer was “more than x”, “around x”, etc., the value was recorded as x.
Fig. 2
Results of round 1 of the DS. Participants’ views on the degree to which each step of SR production and updating should be prioritized for methods development and automation.
Fig. 3
Results of round 2 of the DS. Participants’ views on snowballing (development of better tools, need for automation), GRADE-ing (complex task that requires human judgement, standardization, resource use, methodologically developed area), and deduplication (scope for improvement, advancement of automation).
Fig. 4
Thematic map of the most promising areas and methods to improve the efficient production of SRs. The reflexive thematic analysis of qualitative data identified 3 main topics and 6 subtopics.

