Exploring Differential Diagnosis-Based Explainable AI: A Case Study in Melanoma Detection
- PMID: 40380499
- DOI: 10.3233/SHTI250389
Abstract
Melanoma is a significant global health concern, with rising incidence rates and high mortality when diagnosed late. Artificial Intelligence (AI) models, especially those using deep learning techniques, have shown promising results in melanoma detection. However, the complexity of these models often leads to a lack of transparency, making it difficult for clinicians to understand and trust AI-based diagnoses. This paper presents a novel Explainable AI (XAI) method that aligns with differential diagnosis techniques commonly used in clinical settings, providing more comprehensive explanations. The novel XAI method and four commonly used XAI methods were evaluated with intended users with respect to perceived usability and trust. We found that the new method was considered more useful than the other methods tested. Notably, the widely used saliency mapping technique received the lowest ratings, performing even worse than providing no explanation at all.
Keywords: Explainable AI; Melanoma detection; differential diagnosis; human-centred evaluation; imaging.
