A review of explainable and interpretable AI with applications in COVID-19 imaging
- PMID: 34796530
- PMCID: PMC8646613
- DOI: 10.1002/mp.15359
Abstract
The development of medical imaging artificial intelligence (AI) systems for evaluating COVID-19 patients has demonstrated potential for improving clinical decision making and assessing patient outcomes during the recent COVID-19 pandemic. These systems have been applied to many medical imaging tasks, including disease diagnosis and patient prognosis, and have augmented other clinical measurements to better inform treatment decisions. Because these systems are used in life-or-death decisions, clinical implementation relies on user trust in the AI output. This has led many developers to employ explainability techniques that help a user understand when an AI algorithm is likely to succeed and which cases may be problematic for automatic assessment, thus increasing the potential for rapid clinical translation. The application of AI to COVID-19 has recently been marred by controversy. This review discusses several aspects of explainable and interpretable AI as they pertain to the evaluation of COVID-19 disease and how they can restore trust in the application of AI to this disease. These include the identification of common tasks that are relevant to explainable medical imaging AI, an overview of several modern approaches for producing explainable output as appropriate for a given imaging scenario, a discussion of how to evaluate explainable AI, and recommendations for best practices in explainable/interpretable AI implementation. This review will allow developers of AI systems for COVID-19 to quickly understand the basics of several explainable AI techniques and to select an approach that is both appropriate and effective for a given scenario.
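As a concrete illustration of one explainability technique of the kind surveyed in the review (this is not code from the review itself), the sketch below shows a minimal Grad-CAM implementation in PyTorch. It assumes a generic convolutional classifier (an untrained ResNet-18 stands in for a hypothetical COVID-19 chest-imaging model) and produces a coarse saliency map indicating which image regions most influenced the predicted class.

```python
import torch
import torch.nn.functional as F
from torchvision.models import resnet18

# Stand-in network: a hypothetical trained COVID-19 chest-image classifier
# would be loaded here instead of an untrained ResNet-18.
model = resnet18(weights=None)
model.eval()

activations, gradients = {}, {}

def save_activation(module, inputs, output):
    activations["value"] = output.detach()

def save_gradient(module, grad_input, grad_output):
    gradients["value"] = grad_output[0].detach()

# Hook the last convolutional block; the appropriate layer depends on the model.
model.layer4.register_forward_hook(save_activation)
model.layer4.register_full_backward_hook(save_gradient)

def grad_cam(image):
    """image: (1, 3, H, W) tensor; returns an (H, W) saliency map scaled to [0, 1]."""
    logits = model(image)
    class_idx = int(logits.argmax(dim=1))
    model.zero_grad()
    logits[0, class_idx].backward()

    # Weight each feature map by its average gradient, combine the maps, and
    # keep only the positive evidence for the predicted class.
    weights = gradients["value"].mean(dim=(2, 3), keepdim=True)   # (1, C, 1, 1)
    cam = F.relu((weights * activations["value"]).sum(dim=1, keepdim=True))
    cam = F.interpolate(cam, size=image.shape[2:], mode="bilinear",
                        align_corners=False)
    return ((cam - cam.min()) / (cam.max() - cam.min() + 1e-8))[0, 0]

saliency = grad_cam(torch.rand(1, 3, 224, 224))  # random image as a placeholder
```

In practice the resulting map is overlaid on the radiograph or CT slice for review by a clinician; the choice of hooked layer, target class, and normalization above are illustrative assumptions rather than a prescribed configuration.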
Keywords: AI; COVID-19; deep learning; explainability; interpretability.
© 2021 American Association of Physicists in Medicine.
Conflict of interest statement
IEN has served as deputy editor of
Similar articles
- Evaluating Explainable Artificial Intelligence (XAI) techniques in chest radiology imaging through a human-centered lens. PLoS One. 2024;19(10):e0308758. doi: 10.1371/journal.pone.0308758. PMID: 39383147. Free PMC article.
- Explainable artificial intelligence in emergency medicine: an overview. Clin Exp Emerg Med. 2023;10(4):354-362. doi: 10.15441/ceem.23.145. PMID: 38012816. Free PMC article.
- Explainable artificial intelligence and machine learning: novel approaches to face infectious diseases challenges. Ann Med. 2023;55(2):2286336. doi: 10.1080/07853890.2023.2286336. PMID: 38010090. Free PMC article.
- Explainable Artificial Intelligence Methods in Combating Pandemics: A Systematic Review. IEEE Rev Biomed Eng. 2023;16:5-21. doi: 10.1109/RBME.2022.3185953. PMID: 35737637.
- An Overview of Explainable AI Studies in the Prediction of Sepsis Onset and Sepsis Mortality. Stud Health Technol Inform. 2024;316:808-812. doi: 10.3233/SHTI240534. PMID: 39176915. Review.
Cited by
- MIDRC CRP10 AI interface - an integrated tool for exploring, testing and visualization of AI models. Phys Med Biol. 2023;68(7). doi: 10.1088/1361-6560/acb754. PMID: 36716497. Free PMC article.
- Simulating clinical features on chest radiographs for medical image exploration and CNN explainability using a style-based generative adversarial autoencoder. Sci Rep. 2024;14(1):24427. doi: 10.1038/s41598-024-75886-0. PMID: 39424900. Free PMC article.
- Weekly Nowcasting of New COVID-19 Cases Using Past Viral Load Measurements. Viruses. 2022;14(7):1414. doi: 10.3390/v14071414. PMID: 35891394. Free PMC article.
- Integrating artificial intelligence with smartphone-based imaging for cancer detection in vivo. Biosens Bioelectron. 2025;271:116982. doi: 10.1016/j.bios.2024.116982. PMID: 39616900. Review.
- Regulation of AI algorithms for clinical decision support: a personal opinion. Int J Comput Assist Radiol Surg. 2024;19(4):609-611. doi: 10.1007/s11548-024-03088-0. PMID: 38478205. No abstract available.