Radiol Imaging Cancer. 2021 May;3(3):e200024. doi: 10.1148/rycan.2021200024.

Segmentation of the Prostate Transition Zone and Peripheral Zone on MR Images with Deep Learning

Michelle Bardis et al. Radiol Imaging Cancer. 2021 May.

Abstract

Purpose: To develop a deep learning model to delineate the transition zone (TZ) and peripheral zone (PZ) of the prostate on MR images.

Materials and Methods: This retrospective study included patients who underwent multiparametric prostate MRI and MRI/transrectal US fusion biopsy between January 2013 and May 2016. A board-certified abdominal radiologist manually segmented the prostate, TZ, and PZ on the entire data set. Included accessions were split into 60% training, 20% validation, and 20% test data sets for model development. Three convolutional neural networks with a U-Net architecture were trained for automatic recognition of the prostate organ, TZ, and PZ. Segmentation performance was assessed using Dice scores and Pearson correlation coefficients.

Results: A total of 242 patients were included (242 MR images; 6292 total images). Models for prostate organ, TZ, and PZ segmentation were trained and validated. On the test data set, the mean Dice score for prostate organ segmentation was 0.940 (interquartile range, 0.930-0.961), and the Pearson correlation coefficient for volume was 0.981 (95% CI: 0.966, 0.989). For TZ segmentation, the mean Dice score was 0.910 (interquartile range, 0.894-0.938), and the Pearson correlation coefficient for volume was 0.992 (95% CI: 0.985, 0.995). For PZ segmentation, the mean Dice score was 0.774 (interquartile range, 0.727-0.832), and the Pearson correlation coefficient for volume was 0.927 (95% CI: 0.870, 0.957).

Conclusion: Deep learning with an architecture composed of three U-Nets can accurately segment the prostate, TZ, and PZ on MR images.

Supplemental material is available for this article. © RSNA, 2021.

Keywords: Genital/Reproductive; MRI; Neural Networks; Prostate.
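
The reported metrics pair a per-patient Dice score for segmentation overlap with a Pearson correlation coefficient for volume agreement. As a minimal sketch, assuming binary NumPy masks and paired per-patient volume arrays (illustrative, not the authors' code):

```python
# Sketch of the two evaluation metrics named in the abstract.
import numpy as np

def dice_score(pred, truth):
    """Dice = 2|A ∩ B| / (|A| + |B|) for binary masks of any shape."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    # Convention: two empty masks count as perfect agreement.
    return 2.0 * np.logical_and(pred, truth).sum() / denom if denom else 1.0

def volume_pearson_r(pred_volumes, gt_volumes):
    """Pearson r between predicted and ground truth volumes (1D arrays)."""
    return np.corrcoef(pred_volumes, gt_volumes)[0, 1]
```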


Conflict of interest statement

Disclosures of Conflicts of Interest: M.B. Activities related to the present article: disclosed no relevant relationships. Activities not related to the present article: author is a recipient of Radiological Society of North America Medical Student Research Grant (RMS1902) and recipient of Alpha Omega Alpha Carolyn L. Kuckein Student Research Fellowship. Other relationships: disclosed no relevant relationships. R.H. disclosed no relevant relationships. C. Chantaduly disclosed no relevant relationships. K.T.H. disclosed no relevant relationships. A.U. disclosed no relevant relationships. C. Chahine disclosed no relevant relationships. M.R. disclosed no relevant relationships. D.C. Activities related to the present article: disclosed no relevant relationships. Activities not related to the present article: author received consultancy fees from Canon Medical; author is employed by University of California, Irvine; author received money from Cullins and Grandy for expert testimony; author has stock/stock options in Avicenna.ai. Other relationships: disclosed no relevant relationships. P.C. Activities related to the present article: disclosed no relevant relationships. Activities not related to the present article: author received payment for lectures including service on speakers bureaus from Canon Medical; author is a cofounder of and has stock/stock options in Avicenna.ai. Other relationships: disclosed no relevant relationships.

Figures

Figure 1: A, Axial T2-weighted section of a prostate with marked benign prostatic hyperplasia. B, The transition zone is shown in green, while, C, the peripheral zone is displayed in pink.

Figure 2: Three convolutional neural networks were combined to segment the transition zone (TZ) and the peripheral zone (PZ). The T2-weighted prostate MR image is first passed through U-NetA, which performs localization and creates a bounding box around the prostate that narrows the field of view. This output is then passed simultaneously to U-NetB and U-NetC. U-NetB completes further localization by segmenting out the prostate organ itself, while U-NetC classifies each pixel as either TZ or PZ. U-NetB’s output is then used to identify the voxels in U-NetC’s output that belong to the prostate. After the appropriate voxels are selected in U-NetC’s output, the final output with TZ and PZ segmentations is produced.

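To make the data flow concrete, the following is a hedged PyTorch sketch of the cascade; the stand-in TinyUNet, the 0.5 thresholds, and the cropping margin are illustrative assumptions, not the authors' released implementation.

```python
# Sketch of the three-U-Net cascade in Figure 2 (tensor shape: B, C, D, H, W).
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """Stand-in for U-NetA/B/C; the real networks are deeper U-Nets
    with contraction/expansion pathways."""
    def __init__(self, in_ch=1, out_ch=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(in_ch, 8, kernel_size=(1, 3, 3), padding=(0, 1, 1)),
            nn.BatchNorm3d(8),
            nn.ReLU(inplace=True),
            nn.Conv3d(8, out_ch, kernel_size=1),
        )

    def forward(self, x):
        return self.net(x)

def crop_to_box(volume, mask, margin=4):
    """Crop a volume to the bounding box of a boolean mask, plus a margin."""
    coords = mask.nonzero(as_tuple=False)   # rows of (b, c, z, y, x) indices
    if coords.numel() == 0:                 # nothing detected: keep full FOV
        return volume
    lo = coords.min(dim=0).values
    hi = coords.max(dim=0).values + 1
    z0, y0, x0 = (max(int(lo[i]) - margin, 0) for i in (2, 3, 4))
    z1, y1, x1 = (int(hi[i]) + margin for i in (2, 3, 4))
    return volume[:, :, z0:z1, y0:y1, x0:x1]

def segment_zones(t2, unet_a, unet_b, unet_c):
    # 1) U-NetA localizes the prostate; its mask defines the bounding box
    #    that narrows the field of view.
    box_mask = torch.sigmoid(unet_a(t2)) > 0.5
    roi = crop_to_box(t2, box_mask)
    # 2) U-NetB segments the prostate organ within the ROI.
    organ = torch.sigmoid(unet_b(roi)) > 0.5
    # 3) U-NetC labels every ROI voxel as TZ vs PZ; U-NetB's organ mask
    #    then selects the voxels that actually belong to the prostate.
    tz_prob = torch.sigmoid(unet_c(roi))
    tz = (tz_prob > 0.5) & organ
    pz = (tz_prob <= 0.5) & organ
    return tz, pz

# Example with a random 16-slice, 128 x 128 volume.
tz, pz = segment_zones(torch.randn(1, 1, 16, 128, 128),
                       TinyUNet(), TinyUNet(), TinyUNet())
```
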
Figure 3: The reliability and accuracy of the neural networks in estimating the volume of the prostate, transition zone, and peripheral zone are shown with Bland-Altman plots. A–C, The dashed black line represents the average difference. The dashed blue lines show 2 standard deviations of the difference above and below the average difference. CNN = convolutional neural network, GT = ground truth.

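The plotted quantities reduce to a few lines of NumPy. A minimal sketch, assuming paired arrays of CNN and ground truth volumes:

```python
# Bland-Altman statistics behind Figure 3: the mean CNN-minus-GT
# difference (dashed black line) and +/- 2 SD limits (dashed blue lines).
import numpy as np

def bland_altman(cnn_vol, gt_vol):
    cnn_vol, gt_vol = np.asarray(cnn_vol), np.asarray(gt_vol)
    mean_pair = (cnn_vol + gt_vol) / 2      # x axis of each point
    diff = cnn_vol - gt_vol                 # y axis of each point
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return mean_pair, diff, bias, (bias - 2 * sd, bias + 2 * sd)
```
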
Figure 4: A, U-NetA neural network architecture. U-NetA localizes the prostate by creating a bounding box around it and narrows the field of view (see Appendix E1 [supplement]). The image is processed with nine layers that consist of convolutions (Convs), batch normalization (Batch Norm), and rectified linear unit (ReLU) activation. Both the contraction and expansion pathways use convolutional kernels that have 1 × 3 × 3 and 2 × 1 × 1 filters. The image is downsampled to a 1 × 1 × 1 matrix before it is upsampled. B, U-NetB neural network architecture. U-NetB completes prostate organ segmentation by classifying each pixel as either belonging to the prostate or background. The image is processed with six layers that consist of convolutions, batch normalization, and ReLU activation. The contraction and expansion pathways use convolutional kernels that have 1 × 3 × 3 and 2 × 1 × 1 filters. The image is collapsed to a 1 × 8 × 8 image before it is expanded. C, U-NetC neural network architecture. U-NetC differentiates between transition zone and peripheral zone by classifying every voxel in the image as one of these two classes. This classification then identifies the border between these two prostate regions. U-NetC has the same architecture as U-NetB and implements six layers that perform convolutions, batch normalization, and ReLU activation. The convolutional kernels use 1 × 3 × 3 and 2 × 1 × 1 filters. The image is downsampled to a 1 × 8 × 8 image before it is upsampled. 3D = three dimensional, 2D = two dimensional.

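As an illustration of the layer pattern described above, here is a hedged sketch of one convolution block pairing the 1 × 3 × 3 and 2 × 1 × 1 kernels; channel counts, padding, and layer ordering are assumptions rather than the paper's exact hyperparameters (see Appendix E1 [supplement]).

```python
# Illustrative Conv + BatchNorm + ReLU block for the Figure 4
# architectures: an in-plane 1x3x3 convolution paired with a 2x1x1
# convolution that mixes adjacent slices.
import torch.nn as nn

def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv3d(in_ch, out_ch, kernel_size=(1, 3, 3), padding=(0, 1, 1)),
        nn.BatchNorm3d(out_ch),
        nn.ReLU(inplace=True),
        # Unpadded even-sized depth kernel: shortens the slice axis by
        # one per block, gradually downsampling along the contraction path.
        nn.Conv3d(out_ch, out_ch, kernel_size=(2, 1, 1)),
        nn.BatchNorm3d(out_ch),
        nn.ReLU(inplace=True),
    )

# Example: two blocks along a contraction pathway.
encoder = nn.Sequential(conv_block(1, 16), conv_block(16, 32))
```
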
Figure 5: Example segmentations in three patients with A, a transition zone (TZ) Dice score of 0.940 and a peripheral zone (PZ) Dice score of 0.902; B, a TZ Dice score of 0.910 and a PZ Dice score of 0.869; and C, a TZ Dice score of 0.978 and a PZ Dice score of 0.907. CNN = convolutional neural network. Green and pink borders indicate TZ and PZ, respectively.

Figure 6: Examples of challenging segmentations. A, Segmentation resulted in a peripheral zone (PZ) Dice score of 0.642 and a transition zone (TZ) Dice score of 0.893. B, Segmentation resulted in a PZ Dice score of 0.669 and a TZ Dice score of 0.955. Both the TZ and PZ are more challenging for the neural networks to segment at the prostate base. Green and pink borders indicate TZ and PZ, respectively. CNN = convolutional neural network.

