Med Phys. 2021 Jul;48(7):3916-3926.
doi: 10.1002/mp.14946. Epub 2021 Jun 2.

Self-supervised learning for accelerated 3D high-resolution ultrasound imaging

Xianjin Dai et al. Med Phys. 2021 Jul.

Abstract

Purpose: Ultrasound (US) imaging has been widely used in diagnosis, image-guided intervention, and therapy, where high-quality three-dimensional (3D) images are highly desired but must often be reconstructed from sparsely acquired two-dimensional (2D) images. This study aims to develop a deep learning-based algorithm to reconstruct high-resolution (HR) 3D US images relying only on the acquired sparsely distributed 2D images.

Methods: We propose a self-supervised learning framework using a cycle-consistent generative adversarial network (cycleGAN), in which two independent cycleGAN models are trained with pairs of original US images and two sets of low-resolution (LR) US images, obtained by down-sampling the original US images along each of the two in-plane axes, respectively. In US imaging, in-plane spatial resolution is generally much higher than through-plane resolution. By learning the mapping from down-sampled in-plane LR images to the original HR US images, each cycleGAN can generate through-plane HR images from the original sparsely distributed 2D images. Finally, HR 3D US images are reconstructed by combining the 2D images generated by the two cycleGAN models.
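As a rough sketch of the self-supervision idea described above (not the authors' code), training pairs can be derived from the acquired volume itself: each HR in-plane slice serves as the target, and the same slice subsampled by the enhancement factor mimics the sparse through-plane sampling and serves as the LR input. The function name and synthetic data below are illustrative assumptions.

```python
import numpy as np

def make_training_pairs(volume, factor):
    """Build (LR, HR) 2D slice pairs for self-supervised training.

    Each in-plane (axial) slice of `volume` is an HR target; keeping
    every `factor`-th row of that slice mimics the sparse through-plane
    sampling and yields the corresponding LR input.
    """
    hr = [volume[i] for i in range(volume.shape[0])]   # HR in-plane slices
    lr = [s[::factor] for s in hr]                     # subsampled LR versions
    return lr, hr

# Tiny synthetic example: a 2-slice volume of shape (2, 6, 4).
vol = np.arange(48).reshape(2, 6, 4).astype(float)
lr, hr = make_training_pairs(vol, factor=3)
```

A model trained on such pairs can then be applied in the orthogonal (through-plane) direction, where the data really are sparse, which is what removes the need for external HR atlases.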

Results: The proposed method was assessed on two datasets: automatic breast ultrasound (ABUS) images from 70 breast cancer patients, and US images collected from 45 prostate cancer patients. With a spatial resolution enhancement factor of 3 on the breast cases, the proposed method achieved a mean absolute error (MAE) of 0.90 ± 0.15, a peak signal-to-noise ratio (PSNR) of 37.88 ± 0.88 dB, and a visual information fidelity (VIF) of 0.69 ± 0.01, significantly outperforming bicubic interpolation. Similar performance was achieved with an enhancement factor of 5 in the breast cases, and with enhancement factors of 5 and 10 in the prostate cases.
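The MAE and PSNR figures reported above follow standard definitions; a minimal sketch of how such metrics are computed (assuming NumPy arrays and an assumed intensity range, not the authors' evaluation code) is:

```python
import numpy as np

def mae(pred, gt):
    # Mean absolute error over all voxels.
    return float(np.mean(np.abs(pred - gt)))

def psnr(pred, gt, data_range=255.0):
    # Peak signal-to-noise ratio in dB, relative to the intensity range.
    mse = np.mean((pred - gt) ** 2)
    return float(10.0 * np.log10(data_range ** 2 / mse))
```

VIF is more involved (it models natural-scene statistics) and is typically computed with an existing implementation rather than from scratch.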

Conclusions: We have proposed and investigated a new deep learning-based algorithm for reconstructing HR 3D US images from sparsely acquired 2D images. Significant improvement in through-plane resolution was achieved using only the acquired 2D images, without any external atlas images. This self-supervision capability could accelerate HR US imaging.

Keywords: cycle-consistent generative adversarial network; deep learning; image-guided therapy; self-supervised learning; ultrasound imaging.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Fig. 1. The schematic diagram of the proposed self-supervised learning framework. cycleGAN, cycle-consistent generative adversarial network.

Fig. 2. The network architecture of cycleGAN. LR, low-resolution image; HR, high-resolution image; Syn., synthetic; Cyc., cycle; Conv., convolution; DeConv., deconvolution.

Fig. 3. An illustrative example of generating high-resolution breast ultrasound images with a factor of 3. (a1–a6) show the axial views with the red-box-area zoom-in images (b1–b6), (c1–c6) are the sagittal-view images and their green-box-area zoom-in images (d1–d6), and (e1–e6) show the coronal views with the blue-box-area zoom-in images (f1–f6). GT, ground truth; LR, down-sampled low-resolution images; Proposed, images predicted by the proposed method; Bicubic, images obtained through bicubic interpolation; Proposed – GT, the differences between the proposed method predictions and the ground truth; Bicubic – GT, the differences between the images by bicubic interpolation and the ground truth.

Fig. 4. One case of generating high-resolution breast ultrasound images with a down-sampling factor of 5. (a1–a6) are the axial-view images with the red-box-area zoom-in images (b1–b6), (c1–c6) show the sagittal-view images and their green-box-area zoom-in images (d1–d6), and (e1–e6) are the coronal views with the blue-box-area zoom-in images (f1–f6). GT, ground truth; LR, down-sampled low-resolution images; Proposed, images predicted by the proposed method; Bicubic, images obtained through bicubic interpolation; Proposed – GT, the differences between the proposed method predictions and the ground truth; Bicubic – GT, the differences between the images by bicubic interpolation and the ground truth.

Fig. 5. One case of generating high-resolution prostate ultrasound images with a factor of 5. (a1–a3) are the axial-view images, (b1–b3) show the sagittal-view images, and (c1–c3) are the coronal views. LR, low-resolution input images; Bicubic, images obtained through bicubic interpolation; Proposed, images predicted by the proposed method.

Fig. 6. One case of generating high-resolution prostate ultrasound images with a factor of 10. (a1–a3) show the axial-view images, (b1–b3) are the sagittal-view images, and (c1–c3) show the coronal-view images. LR, low-resolution input images; Bicubic, images obtained through bicubic interpolation; Proposed, images predicted by the proposed method.

Fig. 7. A validation example of generating high-resolution prostate phantom images with a factor of 5. (a1–a3) show the axial-, sagittal-, and coronal-view images of the ground truth, (b1–b3) are the down-sampled low-resolution images, (c1–c3) show the predictions by our proposed method, and (d1–d3) are the results of the bicubic interpolation method. GT, ground truth; LR, down-sampled low-resolution images; Proposed, images predicted by the proposed method; Bicubic, images obtained through bicubic interpolation.
