Pre-Reconstruction Processing with the Cycle-Consistent Generative Adversarial Network Combined with Attention Gate to Improve Image Quality in Digital Breast Tomosynthesis

Tsutomu Gomi et al. Diagnostics (Basel). 2024 Sep 4;14(17):1957. doi: 10.3390/diagnostics14171957.

Abstract

This study proposed and evaluated the "residual squeeze and excitation attention gate" (rSEAG), a novel network that can improve image quality by reducing distortion attributed to artifacts. The method was established by modifying the generator network of a cycle-consistent generative adversarial network (cycleGAN) that processes projection data for pre-reconstruction processing in digital breast tomosynthesis. Residual squeeze-and-excitation blocks were installed in the bridge of the generator network, and attention gates were installed in the skip connections between the encoder and decoder. Based on the radiation dose indices incident on the detector (exposure index and deviation index), the cases approved by the ethics committee and used for the study were classified as reference (675 projection images) and object (675 projection images). Unsupervised (unpaired) data containing a mixture of cases with and without masses were used. Training was performed with cycleGAN using rSEAG and the conventional generator networks (ResUNet and U-Net). For testing, predictive processing was performed on cases (60 projection images) that were not used for training. Images were generated by filtered backprojection reconstruction (kernel: Ramachandran and Lakshminarayanan) from the processed test projection data and from the data without pre-reconstruction processing (evaluation: in-focus plane). Distortion was evaluated using perception-based image quality evaluation (PIQE) analysis, texture analysis (features: "Homogeneity" and "Contrast"), and a statistical model with a Gumbel distribution. rSEAG yielded a low PIQE value. Texture analysis showed that rSEAG and the network without cycleGAN were similar in terms of the "Contrast" feature. In dense breasts, ResUNet had the lowest "Contrast" feature, and U-Net varied between cases. In the maximal variations of the Gumbel plot, rSEAG reduced the high-frequency ripple artifacts. In this study, rSEAG could reduce distortion and ripple artifacts.
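The two building blocks named above can be written as a minimal PyTorch sketch. This is an assumption-laden illustration, not the authors' implementation: module names, channel counts, and the SE reduction ratio are hypothetical, and only the placement (SE in the generator bridge, attention gates on the encoder-decoder skip connections) mirrors the description in the abstract.

# Minimal sketch (not the authors' code) of a squeeze-and-excitation (SE) block
# for the generator bridge and an additive attention gate for the skip connections.
import torch.nn as nn
import torch.nn.functional as F

class SEBlock(nn.Module):
    """Squeeze-and-excitation block: reweights channels using globally pooled statistics."""
    def __init__(self, channels: int, reduction: int = 16):  # reduction ratio is an assumption
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                          # squeeze: global average pooling
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),                                     # excitation: per-channel weights in (0, 1)
        )

    def forward(self, x):
        return x * self.fc(x)                                 # channel-wise rescaling of the input

class AttentionGate(nn.Module):
    """Additive attention gate applied to an encoder skip connection."""
    def __init__(self, skip_channels: int, gate_channels: int, inter_channels: int):
        super().__init__()
        self.w_skip = nn.Conv2d(skip_channels, inter_channels, kernel_size=1)
        self.w_gate = nn.Conv2d(gate_channels, inter_channels, kernel_size=1)
        self.psi = nn.Sequential(nn.Conv2d(inter_channels, 1, kernel_size=1), nn.Sigmoid())
        self.relu = nn.ReLU(inplace=True)

    def forward(self, skip, gate):
        # 'gate' comes from the coarser decoder level; resample it to the skip resolution.
        gate = F.interpolate(gate, size=skip.shape[2:], mode="bilinear", align_corners=False)
        attn = self.psi(self.relu(self.w_skip(skip) + self.w_gate(gate)))
        return skip * attn                                    # suppress irrelevant skip features

In a cycleGAN generator of this kind, each skip connection would pass through an AttentionGate before concatenation in the decoder, and the SEBlock would sit in the bridge between encoder and decoder; the "residual" part would wrap such a block with an identity shortcut (x + block(x)).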

Keywords: attention gate; breast tomosynthesis; cycle-consistent generative adversarial networks; image quality improvement.

Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1
Network architecture for the additive attention gate (AG). AG diagram of the rSEAG network elements.
Figure 2
Network architecture for the residual squeeze and excitation attention gate network (rSEAG). Concept of the rSEAG for digital breast tomosynthesis.
Figure 3
Network architecture for the residual and squeeze and excitation (SE) blocks. Residual and SE diagram of the rSEAG network elements.
Figure 4
Optimization results for parameter (epoch) determination for cycle-consistent generative adversarial network pre-reconstruction processing (generator network: residual squeeze and excitation attention gate network [rSEAG]) in different testing cases. (a) case_1, (b) case_2, (c) case_3, and (d) case_4. The horizontal axis shows the interval between epochs.
Figure 5
Optimization results for parameter (epoch) determination for cycle-consistent generative adversarial network pre-reconstruction processing (generator network: ResUNet) in different cases. (a) case_1, (b) case_2, (c) case_3, and (d) case_4. The horizontal axis shows the interval between epochs.
Figure 6
Optimization results for parameter (epoch) determination for cycle-consistent generative adversarial network pre-reconstruction processing (generator network: U-Net) in different cases. (a) case_1, (b) case_2, (c) case_3, and (d) case_4. The horizontal axis shows the interval between epochs.
Figure 7
Comparison of whole and zoomed images in the in-focus plane (case_1) with and without cycle-consistent generative adversarial network (cycleGAN) pre-reconstruction processing using the respective generator networks (network without cycleGAN, residual squeeze and excitation attention gate network [rSEAG], ResUNet, and U-Net). The display window range for all images is 0–0.02. (a) Without cycleGAN, (b) rSEAG, (c) ResUNet, and (d) U-Net.
Figure 8
Comparison of whole and zoomed images in the in-focus plane (case_2) with and without cycle-consistent generative adversarial network (cycleGAN) pre-reconstruction processing using the respective generator networks (network without cycleGAN, residual squeeze and excitation attention gate network [rSEAG], ResUNet, and U-Net). The display window range for all images is 0–0.02. (a) Without cycleGAN, (b) rSEAG, (c) ResUNet, and (d) U-Net.
Figure 9
Comparison of whole and zoomed images in the in-focus plane (case_3) with and without cycle-consistent generative adversarial network (cycleGAN) pre-reconstruction processing using the respective generator networks (network without cycleGAN, residual squeeze and excitation attention gate network [rSEAG], ResUNet, and U-Net). The display window range for all images is 0–0.02. (a) Without cycleGAN, (b) rSEAG, (c) ResUNet, and (d) U-Net.
Figure 10
Comparison of whole and zoomed images in the in-focus plane (case_4) with and without cycle-consistent generative adversarial network (cycleGAN) pre-reconstruction processing using the respective generator networks (network without cycleGAN, residual squeeze and excitation attention gate network [rSEAG], ResUNet, and U-Net). The display window range for all images is 0–0.02. (a) Without cycleGAN, (b) rSEAG, (c) ResUNet, and (d) U-Net.
Figure 11
Results of the perception-based image quality evaluation (PIQE) for each testing case. Plots of the PIQE for the in-focus plane with and without cycle-consistent generative adversarial network (cycleGAN) pre-reconstruction processing using the respective generator networks (network without cycleGAN, residual squeeze and excitation attention gate network [rSEAG], ResUNet, and U-Net) for each testing case. (Black: without cycleGAN, Red: rSEAG, Green: ResUNet, Yellow: U-Net).
Figure 12
Results of the texture analysis (features: “Homogeneity” and “Contrast”) for each testing case. Plots of “Homogeneity” and “Contrast” for the in-focus plane with and without cycle-consistent generative adversarial network (cycleGAN) pre-reconstruction processing using the respective generator networks (network without cycleGAN, residual squeeze and excitation attention gate network [rSEAG], ResUNet, and U-Net) for each testing case. (Black: without cycleGAN, Red: rSEAG, Green: ResUNet, Yellow: U-Net). A minimal sketch of computing these texture features and the Gumbel fit used for Figure 13 follows the figure list.
Figure 13
The largest variations extracted from 29 pixel-value profiles are plotted. The relatively large variations in pixel values were attributed to high-frequency ripple artifacts in the different cases. (a) case_1, (b) case_2, (c) case_3, and (d) case_4.
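As a rough computational sketch of the evaluation quantities behind Figures 12 and 13, the following uses scikit-image's gray-level co-occurrence matrix utilities and SciPy's Gumbel distribution. It is an illustration under stated assumptions, not the authors' analysis: the GLCM quantization level, pixel offsets, and the definition of a profile's "variation" are guesses, and PIQE is omitted because no standard Python implementation is assumed here.

# Illustrative sketch (assumptions noted inline) of GLCM "Homogeneity"/"Contrast"
# features and a Gumbel fit to the largest pixel-value variations.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from scipy.stats import gumbel_r

def texture_features(slice_2d, levels=64):
    """GLCM Homogeneity and Contrast for one in-focus reconstructed slice."""
    # Quantize the slice to 'levels' gray levels (the level count is an assumption).
    edges = np.linspace(slice_2d.min(), slice_2d.max(), levels)
    quantized = (np.digitize(slice_2d, edges) - 1).astype(np.uint8)
    glcm = graycomatrix(quantized, distances=[1], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    return graycoprops(glcm, "homogeneity").mean(), graycoprops(glcm, "contrast").mean()

def gumbel_fit_of_max_variations(profiles):
    """Fit a Gumbel distribution to the maximal variation of each pixel-value profile."""
    max_variations = [float(p.max() - p.min()) for p in profiles]  # "variation" taken as the range (assumption)
    location, scale = gumbel_r.fit(max_variations)
    return location, scale

# Toy usage with synthetic data standing in for a reconstructed slice and 29 profiles.
rng = np.random.default_rng(0)
homogeneity, contrast = texture_features(rng.random((256, 256)))
location, scale = gumbel_fit_of_max_variations([rng.random(200) for _ in range(29)])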
