SDAM: A dual attention mechanism for high-quality fusion of infrared and visible images
- PMID: 39316595
- PMCID: PMC11421820
- DOI: 10.1371/journal.pone.0308885
Abstract
Fusing infrared and visible images to obtain high-quality fused images with prominent infrared targets has important applications in many engineering fields. However, current fusion methods suffer from unclear texture details and an imbalance between infrared targets and texture detail, which leads to information loss. To address these issues, this paper proposes an infrared and visible image fusion method based on a specific dual-attention mechanism (SDAM). The method employs an end-to-end network that incorporates channel attention and spatial attention mechanisms, allowing it to fully exploit the texture details of the visible images while preserving the salient information of the infrared images. In addition, an optimized loss function combining content loss, edge loss, and structure loss is designed to achieve better fusion results while maintaining good brightness and contrast, improving the visual quality of the fused images. Ablation experiments and comparative evaluations on public datasets demonstrate that SDAM outperforms current state-of-the-art fusion methods in both subjective and objective assessments.
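The abstract's two ingredients (channel attention followed by spatial attention, plus a weighted content/edge/structure loss) can be sketched roughly as below. This is a minimal NumPy illustration under assumed CBAM-style attention and hypothetical loss terms, not the authors' actual implementation; the function names, loss formulas, and weights `w_content`, `w_edge`, and `w_struct` are invented for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(feat):
    """Rescale each channel by a squeeze (global average pool) + sigmoid gate.
    feat: (C, H, W) feature map. A real network passes the pooled vector
    through a small MLP before the gate; here the gate is applied directly."""
    pooled = feat.mean(axis=(1, 2))           # (C,) global average pooling
    weights = sigmoid(pooled)                 # per-channel gate in (0, 1)
    return feat * weights[:, None, None]

def spatial_attention(feat):
    """Rescale each spatial location using channel-wise mean and max statistics.
    A real network convolves the stacked maps; here they are simply summed."""
    avg_map = feat.mean(axis=0)               # (H, W)
    max_map = feat.max(axis=0)                # (H, W)
    attn = sigmoid(avg_map + max_map)         # per-pixel gate in (0, 1)
    return feat * attn[None, :, :]

def dual_attention(feat):
    """Channel attention followed by spatial attention (CBAM-style ordering)."""
    return spatial_attention(channel_attention(feat))

def fusion_loss(fused, ir, vis, w_content=1.0, w_edge=0.5, w_struct=0.5):
    """Weighted sum of hypothetical content, edge, and structure terms.
    fused, ir, vis: (H, W) grayscale images in [0, 1]."""
    # content: keep the brighter (salient) intensity of the two sources
    content = np.mean((fused - np.maximum(ir, vis)) ** 2)
    # edge: follow the gradients of the visible image (texture detail)
    gy, gx = np.gradient(fused)
    gy_v, gx_v = np.gradient(vis)
    edge = np.mean(np.abs(gy - gy_v)) + np.mean(np.abs(gx - gx_v))
    # structure: crude proxy via correlation with the visible image
    struct = 1.0 - np.corrcoef(fused.ravel(), vis.ravel())[0, 1]
    return w_content * content + w_edge * edge + w_struct * struct
```

Because both gates are sigmoids in (0, 1), a non-negative feature map is always attenuated, never amplified; in the real network the learned MLP/convolution decides which channels and locations survive.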
Copyright: © 2024 Hu et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Conflict of interest statement
The authors have declared that no competing interests exist.
Similar articles
- Fusion algorithm of visible and infrared image based on anisotropic diffusion and image enhancement. PLoS One. 2021 Feb 19;16(2):e0245563. doi: 10.1371/journal.pone.0245563. PMID: 33606680. Free PMC article.
- DRSNFuse: Deep Residual Shrinkage Network for Infrared and Visible Image Fusion. Sensors (Basel). 2022 Jul 8;22(14):5149. doi: 10.3390/s22145149. PMID: 35890828. Free PMC article.
- DMCM: Two-branch multilevel feature fusion with cross-attention mechanism for infrared and visible image fusion. PLoS One. 2025 Mar 28;20(3):e0318931. doi: 10.1371/journal.pone.0318931. PMID: 40153453. Free PMC article.
- MEEAFusion: Multi-Scale Edge Enhancement and Joint Attention Mechanism Based Infrared and Visible Image Fusion. Sensors (Basel). 2024 Sep 9;24(17):5860. doi: 10.3390/s24175860. PMID: 39275771. Free PMC article.
- Infrared and Visible Image Fusion Technology and Application: A Review. Sensors (Basel). 2023 Jan 4;23(2):599. doi: 10.3390/s23020599. PMID: 36679396. Free PMC article. Review.
References
- Han Z, Zhang C, Feng H, Yue M, Quan K. PFFNET: A Fast Progressive Feature Fusion Network for Detecting Drones in Infrared Images. Drones. 2023;7(7):424. doi: 10.3390/drones7070424
- Xue Y, Zhang J, Lin Z, Li C, Huo B, Zhang Y. SiamCAF: Complementary Attention Fusion-Based Siamese Network for RGBT Tracking. Remote Sensing. 2023;15(13):3252. doi: 10.3390/rs15133252
- Li L, Lv M, Jia Z, Jin Q, Liu M, Chen L, et al. An Effective Infrared and Visible Image Fusion Approach via Rolling Guidance Filtering and Gradient Saliency Map. Remote Sensing. 2023;15(10):2486. doi: 10.3390/rs15102486
- Wang C, Zang Y, Zhou D, Nie R, Mei J. An interactive deep model combined with Retinex for low-light visible and infrared image fusion. Neural Computing and Applications. 2023;1-19.