Improving Breast Cancer Diagnosis in Ultrasound Images Using Deep Learning with Feature Fusion and Attention Mechanism
- PMID: 40436710
- DOI: 10.1016/j.acra.2025.05.007
Abstract
Rationale and objectives: Early detection of malignant lesions in ultrasound images is crucial for effective cancer diagnosis and treatment. While traditional methods rely on radiologists, deep learning models can improve accuracy, reduce errors, and enhance efficiency. This study explores the application of a deep learning model for classifying benign and malignant lesions, focusing on its performance and interpretability.
Materials and methods: In this study, we proposed a feature fusion-based deep learning model for classifying benign and malignant lesions in ultrasound images. The model leverages advanced architectures such as MobileNetV2 and DenseNet121, enhanced with feature fusion and attention mechanisms to boost classification accuracy. The clinical dataset comprised 2171 images collected from 1758 patients between December 2020 and May 2024. Additionally, we utilized the publicly available BUSI dataset, consisting of 780 images from female patients aged 25 to 75, collected in 2018. To enhance interpretability, we applied Grad-CAM, Saliency Maps, and Shapley additive explanations (SHAP) to explain the model's decision-making. A comparative analysis with radiologists of varying expertise levels was also conducted.
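The abstract does not give implementation details, so the following is a minimal sketch, assuming a TensorFlow/Keras setup, of how a dual-backbone (MobileNetV2 + DenseNet121) feature-fusion classifier with a simple channel-attention block could be assembled. The pooling choices, squeeze-and-excitation-style attention, and head sizes are illustrative assumptions, not the authors' exact architecture.

```python
# Illustrative sketch only: dual-backbone feature fusion with channel attention.
# Backbone names (MobileNetV2, DenseNet121) follow the abstract; everything else
# (pooling, attention design, head) is an assumption.
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_fusion_model(input_shape=(224, 224, 3)):
    inputs = layers.Input(shape=input_shape)

    # Two ImageNet-pretrained backbones share the same input tensor.
    mobilenet = tf.keras.applications.MobileNetV2(
        include_top=False, weights="imagenet", input_tensor=inputs)
    densenet = tf.keras.applications.DenseNet121(
        include_top=False, weights="imagenet", input_tensor=inputs)

    # Global pooling turns each backbone's feature map into a feature vector.
    f1 = layers.GlobalAveragePooling2D()(mobilenet.output)   # 1280-d
    f2 = layers.GlobalAveragePooling2D()(densenet.output)    # 1024-d

    # Feature fusion: concatenate the two descriptors.
    fused = layers.Concatenate()([f1, f2])

    # Simple channel attention (squeeze-and-excitation style) that
    # re-weights the fused features before classification.
    att = layers.Dense(fused.shape[-1] // 16, activation="relu")(fused)
    att = layers.Dense(fused.shape[-1], activation="sigmoid")(att)
    fused = layers.Multiply()([fused, att])

    x = layers.Dropout(0.3)(fused)
    outputs = layers.Dense(1, activation="sigmoid")(x)  # benign vs. malignant
    return Model(inputs, outputs)

model = build_fusion_model()
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC(name="auc")])
```

Concatenation followed by a learned re-weighting is one common way to let the classifier emphasize whichever backbone's features are more informative for a given lesion; the paper's actual fusion and attention design may differ.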
Results: The proposed model exhibited the highest performance, achieving an area under the curve (AUC) of 0.9320 on our private dataset and an AUC of 0.9834 on the public dataset, significantly outperforming traditional deep convolutional neural network models. It also exceeded the diagnostic performance of radiologists, showcasing its potential as a reliable tool for medical image classification. The model's success can be attributed to its incorporation of advanced architectures, feature fusion, and attention mechanisms. Its decision-making process was further clarified using interpretability techniques such as Grad-CAM, Saliency Maps, and SHAP, offering insight into its ability to focus on relevant image features for accurate classification.
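As a companion to the interpretability findings, below is a minimal Grad-CAM sketch, assuming a Keras model such as the fusion sketch above; `last_conv_name` is a placeholder for a convolutional layer in one of the backbones and is an assumption, not a detail reported in the abstract.

```python
# Illustrative Grad-CAM sketch for a Keras classifier with a sigmoid output.
import tensorflow as tf

def grad_cam(model, image, last_conv_name):
    """Return a [0, 1] heatmap of regions that drive the malignancy score."""
    conv_layer = model.get_layer(last_conv_name)
    grad_model = tf.keras.Model(model.inputs, [conv_layer.output, model.output])

    with tf.GradientTape() as tape:
        conv_out, preds = grad_model(image[None, ...])    # add batch dimension
        score = preds[:, 0]                               # malignant probability

    grads = tape.gradient(score, conv_out)                # d(score)/d(feature map)
    weights = tf.reduce_mean(grads, axis=(1, 2))          # per-channel importance
    cam = tf.reduce_sum(conv_out * weights[:, None, None, :], axis=-1)[0]
    cam = tf.nn.relu(cam)                                 # keep positive evidence
    return (cam / (tf.reduce_max(cam) + 1e-8)).numpy()
```

In practice the heatmap is resized to the input resolution and overlaid on the ultrasound image, so a reader can check that high-attribution regions coincide with the lesion rather than background artifacts, which is the kind of qualitative evidence the reported Grad-CAM, Saliency Map, and SHAP analyses provide.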
Conclusion: The proposed deep learning model offers superior accuracy in classifying benign and malignant lesions in ultrasound images, outperforming traditional models and radiologists. Its strong performance, coupled with interpretability techniques, demonstrates its potential as a reliable and efficient tool for medical diagnostics.
Data availability: The datasets generated and analyzed during the current study are not publicly available due to the nature of this research and of the study participants, but may be available from the corresponding author on reasonable request.
Keywords: Attention mechanisms; Deep learning; Feature fusion; Medical diagnostics; Ultrasound imaging.
Copyright © 2025. Published by Elsevier Inc.
Conflict of interest statement
Declaration of Competing Interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Similar articles
- An explainable AI-driven deep neural network for accurate breast cancer detection from histopathological and ultrasound images. Sci Rep. 2025 May 20;15(1):17531. doi: 10.1038/s41598-025-97718-5. PMID: 40394112. Free PMC article.
- Towards automated and reliable lung cancer detection in histopathological images using DY-FSPAN: A feature-summarized pyramidal attention network for explainable AI. Comput Biol Chem. 2025 Oct;118:108500. doi: 10.1016/j.compbiolchem.2025.108500. Epub 2025 May 10. PMID: 40381571.
- Deep learning-based MVIT-MLKA model for accurate classification of pancreatic lesions: a multicenter retrospective cohort study. Radiol Med. 2025 Apr;130(4):508-523. doi: 10.1007/s11547-025-01949-5. Epub 2025 Jan 20. PMID: 39832039.
- Integrating intratumoral and peritumoral radiomics with deep transfer learning for DCE-MRI breast lesion differentiation: A multicenter study comparing performance with radiologists. Eur J Radiol. 2024 Aug;177:111556. doi: 10.1016/j.ejrad.2024.111556. Epub 2024 Jun 9. PMID: 38875748.
- Deep learning diagnostic performance and visual insights in differentiating benign and malignant thyroid nodules on ultrasound images. Exp Biol Med (Maywood). 2023 Dec;248(24):2538-2546. doi: 10.1177/15353702231220664. Epub 2024 Jan 26. PMID: 38279511. Free PMC article.