Eur Radiol. 2022 Nov;32(11):7998-8007. doi: 10.1007/s00330-022-08784-6. Epub 2022 Apr 14.

Radiology artificial intelligence: a systematic review and evaluation of methods (RAISE)

Brendan S Kelly et al. Eur Radiol. 2022 Nov.

Abstract

Objective: There has been a large amount of research in the field of artificial intelligence (AI) as applied to clinical radiology. However, these studies vary in design and quality, and systematic reviews of the entire field are lacking. This systematic review aimed to identify all papers that used deep learning in radiology, to survey the literature, and to evaluate their methods. We aimed to identify the key questions being addressed in the literature and the most effective methods employed.

Methods: We followed the PRISMA guidelines and performed a systematic review of studies of AI in radiology published from 2015 to 2019. Our published protocol was prospectively registered.

Results: Our search yielded 11,083 results. Seven hundred sixty-seven full texts were reviewed, and 535 articles were included. Ninety-eight percent were retrospective cohort studies. The median number of patients included was 460. MRI was the most frequently studied modality (37%), and neuroradiology was the most common subspecialty. Eighty-eight percent of studies used supervised learning, and the most common task was segmentation (39%). Performance was compared with a state-of-the-art model in 37% of studies. The most used established architecture was UNet (14%). The median performance for the most utilised evaluation metrics was a Dice score of 0.89 (range 0.49-0.99), an AUC of 0.903 (range 0.61-1.00) and an accuracy of 89.4% (range 70.2-100%). Of the 77 studies that externally validated their results and allowed for direct comparison, performance on average decreased by 6% at external validation (range: 4% increase to 44% decrease).
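
For context, the Dice score summarised above measures the overlap between a predicted and a reference segmentation, and the external-validation figure is a relative change in performance. The short Python sketch below is illustrative only; it is not drawn from any of the reviewed studies, and the variable names and example values are hypothetical. It shows how such a Dice score and a relative performance change might be computed:

    import numpy as np

    def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
        # Overlap between two binary segmentation masks (1 = structure present).
        pred = pred.astype(bool)
        truth = truth.astype(bool)
        intersection = np.logical_and(pred, truth).sum()
        denominator = pred.sum() + truth.sum()
        # By convention, two empty masks count as perfect agreement.
        return 2.0 * intersection / denominator if denominator else 1.0

    # Hypothetical illustration of a relative performance change at external
    # validation, analogous to the ~6% average decrease reported in the review.
    internal_auc, external_auc = 0.903, 0.85   # illustrative values only
    relative_change = (external_auc - internal_auc) / internal_auc
    print(f"Relative change at external validation: {relative_change:.1%}")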

Conclusion: This systematic review has surveyed the major advances in AI as applied to clinical radiology.

Key points:
• While there are many papers reporting expert-level results by using deep learning in radiology, most apply only a narrow range of techniques to a narrow selection of use cases.
• The literature is dominated by retrospective cohort studies with limited external validation and a high potential for bias.
• The recent advent of AI extensions to systematic reporting guidelines and prospective trial registration, along with a focus on external validation and explanations, shows potential for translating the hype surrounding AI from code to clinic.

Keywords: Artificial Intelligence; Methodology; Radiology; Systematic reviews.

Conflict of interest statement

The authors of this manuscript declare no relationships with any companies whose products or services may be related to the subject matter of the article.

Figures

Fig. 1: The PRISMA flow diagram of papers included in our review
Fig. 2: Radiology artificial intelligence articles by clinical area and year
Fig. 3: Radiology artificial intelligence articles by modality and year; colour denotes modality
