An explainable-by-design end-to-end AI framework based on prototypical part learning for lesion detection and classification in Digital Breast Tomosynthesis images
- PMID: 40599244
- PMCID: PMC12212108
- DOI: 10.1016/j.csbj.2025.06.008
Abstract
Background and objective: Breast cancer is the most common cancer among women worldwide, making early detection through breast screening crucial for improving patient outcomes. Digital Breast Tomosynthesis (DBT) is an advanced radiographic technique that enhances clarity over traditional mammography by compiling multiple X-ray images into a 3D reconstruction, thereby improving cancer detection rates. However, the large data volume of DBT poses a challenge for timely analysis. This study aims to introduce a transparent AI system that not only provides a prediction but also an explanation of that prediction, expediting the analysis of DBT scans while ensuring interpretability.
Methods: The study employs a two-stage deep learning process. The first stage uses state-of-the-art Neural Network (NN) models, specifically YOLOv5 and YOLOv8, to detect lesions within the scans. An ensemble method is also explored to enhance detection capabilities. The second stage involves classifying the identified lesions using ProtoPNet, an inherently transparent NN that leverages prototypical part learning to distinguish between benign and cancerous lesions. The system facilitates clear interpretability in decision-making, which is crucial for medical diagnostics.
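The two-stage pipeline described above can be sketched in miniature. The detector and classifier below are stand-in stubs (the study uses YOLOv5/YOLOv8 and ProtoPNet, which are not reproduced here); only the ensemble step is concrete, implemented as a simple cross-model greedy merge of overlapping boxes by IoU, one plausible way to combine two detectors' outputs — the paper's exact ensemble rule is not specified in this abstract.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)

    def area(r):
        return (r[2] - r[0]) * (r[3] - r[1])

    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def ensemble_detections(model_outputs, iou_thr=0.5):
    """Merge candidate lesions from several detectors: greedily keep the
    highest-score box and suppress overlapping lower-score duplicates
    (a cross-model non-maximum suppression)."""
    boxes = sorted((b for out in model_outputs for b in out),
                   key=lambda b: b["score"], reverse=True)
    kept = []
    for b in boxes:
        if all(iou(b["xyxy"], k["xyxy"]) < iou_thr for k in kept):
            kept.append(b)
    return kept
```

Each kept box would then be cropped from the DBT slice and passed to the second-stage classifier for benign-versus-malignant prediction.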
Results: The AI system achieves competitive performance on both tasks: a recall of 0.76 for lesion detection and an accuracy of 0.70 for lesion classification. These metrics, together with validation by expert radiologists through clinical feedback, highlight the system's potential for future clinical relevance. Despite challenges such as dataset limitations and the need for more accurate ground-truth annotations, which cap the achievable metric values, the approach represents a significant advance in applying AI to DBT scans.
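For reference, the two reported metrics are defined as follows; the confusion-matrix counts used here are illustrative, not the study's actual values.

```python
def recall(tp, fn):
    """Fraction of true lesions the detector finds: TP / (TP + FN)."""
    return tp / (tp + fn)

def accuracy(tp, tn, fp, fn):
    """Fraction of correct benign/malignant calls: (TP + TN) / total."""
    return (tp + tn) / (tp + tn + fp + fn)
```

With, say, 76 detected lesions out of 100 true lesions, recall is 0.76; a classifier correct on 70 of 100 crops has accuracy 0.70.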
Conclusions: This study contributes to the growing field of AI in breast cancer screening by emphasizing the need for systems that are not only accurate but also transparent and interpretable. The proposed AI system marks a significant step forward in the timely and accurate analysis of DBT scans, with potential implications for improving early breast cancer detection and patient outcomes.
Keywords: Ante-hoc explainability; DBT; Deep learning; Lesion classification; Lesion detection; ProtoPNet; XAI.
© 2025 The Authors. Published by Elsevier B.V. on behalf of Research Network of Computational and Structural Biotechnology.
Conflict of interest statement
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.