Review

Sensors (Basel). 2024 Dec 18;24(24):8068. doi: 10.3390/s24248068.

Image Synthesis in Nuclear Medicine Imaging with Deep Learning: A Review

Thanh Dat Le et al.

Abstract

Nuclear medicine imaging (NMI) is essential for diagnosing and monitoring various diseases; however, challenges persist regarding image quality and accessibility during NMI-based treatment. This paper reviews deep learning methods for generating synthetic nuclear medicine images, aimed at improving the interpretability and utility of nuclear medicine protocols. We discuss advanced image generation algorithms designed to recover detail from low-dose scans, uncover information obscured by specific radiopharmaceutical properties, and enhance the sensing of physiological processes. By analyzing 30 of the newest publications in this field, we explain how deep learning models produce synthetic nuclear medicine images that closely resemble their real counterparts, significantly improving diagnostic accuracy when images are acquired at doses lower than the clinical standard. Deep learning models also facilitate combining NMI with other imaging modalities, thereby broadening the clinical applications of nuclear medicine. In summary, our review underscores the significant potential of deep learning in NMI, indicating that synthetic image generation may be essential for addressing the existing limitations of NMI and improving patient outcomes.

Keywords: nuclear medicine imaging; synthesizing; transforming.


Conflict of interest statement

The authors declare no conflicts of interest.

Figures

Figure 1
(a) PRISMA flow diagram of included studies. (b) Number of publications on NMI with deep learning from 2017 to 2024 on PubMed, Scopus, IEEE, and arXiv.
Figure 2
Overview of the principle of current NMI processes with DL assistance for professional clinicians. (a) Dose measuring and patient safety balance, (b) scanning geometry (scintigraphy/SPECT/PET), (c) reconstruction and visualization (2D/3D), and (d) data and study cases. The process is enhanced by incorporating DL methods, updating the knowledge of professional clinicians, and utilizing popular reports based on study cases.
Figure 3
Schematic overview of synthetic image generation in nuclear medicine using DL.
Figure 4
Different general DL models for image synthesis. (a) FCNs—convolutional layers, (b) VAEs—encoding and decoding through latent variables, (c) GANs—generator and discriminator layers, and (d) diffusion models—forward and reverse diffusion sequences that add and remove noise.
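The forward (noising) half of the diffusion process in Figure 4d has a well-known closed form. The following is a minimal NumPy sketch, not code from any reviewed paper; the 1000-step linear beta schedule and the 64×64 array standing in for a PET slice are illustrative assumptions:

```python
import numpy as np

def forward_diffusion(x0, t, betas, rng):
    """Noise a clean image x0 directly to step t (closed form):
    x_t = sqrt(alpha_bar_t) * x0 + sqrt(1 - alpha_bar_t) * eps,
    where alpha_bar_t is the cumulative product of (1 - beta_i)."""
    alpha_bar = np.cumprod(1.0 - betas)[t]
    eps = rng.standard_normal(x0.shape)          # Gaussian noise to add
    xt = np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * eps
    return xt, eps

rng = np.random.default_rng(0)
x0 = rng.standard_normal((64, 64))               # stand-in for a PET slice
betas = np.linspace(1e-4, 0.02, 1000)            # illustrative linear schedule
xt, eps = forward_diffusion(x0, 999, betas, rng)  # near-pure noise at the last step
```

The reverse half, which the synthesis network actually learns, is trained to predict `eps` from `xt` so the noising can be undone step by step.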
Figure 5
Selected research articles in NMI by publication year and imaging area target.
Figure 6
(a) (i) 3D residual U-Net architecture for generating synthetic PET images from MRI, with (ii) T1w-MRI exhibiting significant stitching artifacts between bed positions as well as loss of resolution in the head (green arrows); different cross-sections are denoted by blue arrows. Reproduced with permission from Rajagopal et al. [62]; published by IEEE Xplore, 2022. (b) (i) Overview of the CycleGAN framework for MRI-to-PET translation, and (ii) visualization of ground-truth and synthetic PETs. Reproduced with permission from Khojaste-Sarakhsi et al. [68]; published by Image and Vision Computing—ScienceDirect, 2024. (c) (i) Novel joint learning framework combining unsupervised cross-modal synthesis and diagnosis for Alzheimer’s disease, mining underlying shared modality information to improve performance. (ii) Qualitative results of different cross-modal synthesis networks, with SUV ratio error maps between real and synthesized PET images. Reproduced with permission from Wang et al. [72]; published by Medical Image Analysis—ScienceDirect, 2024. (d) (i) Dense U-Net architecture, and (ii) FDG-PET-based tau-PET synthesis result. Reproduced with permission from Lee et al. [73]; published by Brain—Oxford Academic, 2024.
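The cycle-consistency idea behind the CycleGAN framework in Figure 6b can be sketched as follows. This is a rough NumPy illustration, not code from the reviewed paper; the identity-mapping placeholder generators `G`/`F` and the weight `lam=10.0` are assumptions:

```python
import numpy as np

def cycle_consistency_loss(x_mri, x_pet, G, F, lam=10.0):
    """L1 cycle loss: translating MRI -> PET -> MRI (and PET -> MRI -> PET)
    should reproduce the input, even without paired training data."""
    loss_mri = np.mean(np.abs(F(G(x_mri)) - x_mri))  # MRI round trip
    loss_pet = np.mean(np.abs(G(F(x_pet)) - x_pet))  # PET round trip
    return lam * (loss_mri + loss_pet)

# Placeholder "generators": identity maps give a perfect round trip.
G = lambda x: x   # MRI -> synthetic PET (placeholder)
F = lambda x: x   # PET -> synthetic MRI (placeholder)

rng = np.random.default_rng(1)
mri = rng.standard_normal((32, 32))   # stand-in images
pet = rng.standard_normal((32, 32))
```

With identity generators the loss is exactly zero; any imperfect round trip (e.g. `G = lambda x: x + 1`) is penalized in proportion to the reconstruction error, which is what pushes the two unpaired translators toward consistent mappings.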
Figure 7
(a) (i) cGAN to synthesize PET images from CT scans. Validation of synthetic PET images based on (ii) MDA-TEST and (iii) TCIA-STANFORD testing cohorts; (iv) imaging quality difference; (v) tumor contrast difference. Reproduced with permission from Salehjahromi et al. [70]; published by Cell Reports Medicine—Elsevier, 2024. (b) (i) UNETR architecture to generate MS-based dose maps. (ii) Relative absolute error maps in cross-sectional views of SSV/MSV/DL methods. Reproduced with permission from Mansouri et al. [69]; published by EJNMMI—Springer Nature, 2024.
Figure 8
(a) (i) pix2pixHD to generate AI-ExtremePET from ExtremePET. (ii) Comparison between different cross-sections. Reproduced with permission from Hosch et al. [75]; published by EJNMMI—Springer, 2022. (b) (i) DDPET-3D model to generate a synthetic full-dose PET volume from a low-count PET volume. (ii) Whole-body and brain cross-sections comparing synthetic and ground-truth PETs. Reproduced with permission from Xie et al. [76]; published by arXiv, 2024. (c) (i) Modified pix2pix model and (ii) comparison of representative PET images in specific regions with error maps. Reproduced with permission from Li et al. [77]; published by the European Journal of Radiology—ScienceDirect, 2022. (d) (i) Synthetic PET images produced by training a 2D pix2pix model, overlaid on CT and compared with (ii) original CT, (iii) original AC-PET on CT, (iv) V1-PET on CT, and (v) V2-PET on CT. Reproduced with permission from Ma et al. [78]; published by Oncotarget—Impact Journals, 2024.

References

    1. Könik A., O’Donoghue J.A., Wahl R.L., Graham M.M., Van den Abbeele A.D. Theranostics: The Role of Quantitative Nuclear Medicine Imaging. Semin. Radiat. Oncol. 2021;31:28–36. doi: 10.1016/j.semradonc.2020.07.003.
    2. Wahl R.L. Progress in Nuclear Medicine Imaging of Cancers. Prim. Care Clin. Off. Pract. 1998;25:341–360. doi: 10.1016/S0095-4543(05)70068-3.
    3. Le D. An Overview of the Regulations of Radiopharmaceuticals. In: Wong F.C.L., editor. Locoregional Radionuclide Cancer Therapy: Clinical and Scientific Aspects. Springer International Publishing; Cham, Switzerland: 2021. pp. 225–247.
    4. Mariani G., Bruselli L., Kuwert T., Kim E.E., Flotats A., Israel O., Dondi M., Watanabe N. A Review on the Clinical Uses of SPECT/CT. Eur. J. Nucl. Med. Mol. Imaging. 2010;37:1959–1985. doi: 10.1007/s00259-010-1390-8.
    5. Townsend D.W., Carney J.P.J., Yap J.T., Hall N.C. PET/CT Today and Tomorrow. J. Nucl. Med. 2004;45:4S–14S.
