Sci Rep. 2020 Aug 31;10(1):14315.
doi: 10.1038/s41598-020-71080-0.

Automatic prostate and prostate zones segmentation of magnetic resonance images using DenseNet-like U-net

Nader Aldoj et al.

Abstract

Magnetic resonance imaging (MRI) provides detailed anatomical images of the prostate and its zones and plays a crucial role in many diagnostic applications. Automatic segmentation of the prostate and its zones from MR images facilitates many diagnostic and therapeutic applications. However, the lack of a clear prostate boundary, the heterogeneity of prostate tissue, and the wide interindividual variety of prostate shapes make this a very challenging task. To address this problem, we propose a new neural network to automatically segment the prostate and its zones. We term this algorithm Dense U-net, as it is inspired by two existing state-of-the-art architectures, DenseNet and U-net. We trained the algorithm on 141 patient datasets and tested it on 47 patient datasets using axial T2-weighted images in a four-fold cross-validation fashion. The networks were trained and tested separately on weakly and accurately annotated masks to test the hypothesis that the network can learn even when the labels are not accurate. The network successfully detects the prostate region and segments the gland and its zones. Compared with U-net, the second version of our algorithm, Dense-2 U-net, achieved an average Dice score for the whole prostate of 92.1 ± 0.8% vs. 90.7 ± 2%, for the central zone of [Formula: see text]% vs. [Formula: see text]%, and for the peripheral zone of 78.1 ± 2.5% vs. [Formula: see text]%. Our initial results show Dense-2 U-net to be more accurate than the state-of-the-art U-net for automatic segmentation of the prostate and prostate zones.
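The Dice score reported above can be computed directly from a pair of binary segmentation masks; a minimal sketch in Python/NumPy (the function name and the toy masks are illustrative, not taken from the paper):

```python
import numpy as np

def dice_score(pred, truth):
    """Dice similarity coefficient between two binary masks:
    2 * |pred AND truth| / (|pred| + |truth|)."""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    total = pred.sum() + truth.sum()
    if total == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(pred, truth).sum() / total

# Toy 2x3 masks: 2 overlapping foreground pixels, 3 + 3 foreground pixels
a = np.array([[1, 1, 0], [0, 1, 0]])
b = np.array([[1, 0, 0], [0, 1, 1]])
print(f"Dice = {dice_score(a, b):.3f}")  # Dice = 0.667
```

In practice the score is averaged per patient over all slices of a class (whole gland, CZ, or PZ), which is how the percentages in the abstract are reported.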


Conflict of interest statement

The authors declare no competing interests.

Figures

Figure 1
Images illustrating variations in the MR appearance of the prostate gland in four different patients (columns); rows from top to bottom show the prostate, CZ, and PZ, respectively.
Figure 2
Segmentation results of the classical U-net (first row) and Dense-2 U-net (second row) algorithms. From left to right: ground truth, prostate, CZ, and PZ.
Figure 3
Segmentation of the prostate and its zones (Dense-2 U-net) for two examples (A) and (B). Columns from left to right show the original image with the prostate outlined, followed by the predicted masks of the prostate, CZ, and PZ with their corresponding ground truth and an overlay, and a magnification of the overlap. Rows 1 and 3 show examples (A) and (B); rows 2 and 4 show the corresponding magnifications.
Figure 4
Segmentation results of the Dense-2 U-net: (left) ground truth, (middle) predicted segmentation mask, (right) overlap between ground truth and predicted segmentation mask. The top row shows images of the mid-gland, and the bottom row shows images of the apex.
Figure 5
Segmentation results on a weakly annotated dataset. The upper row shows the weakly annotated ground truth while the bottom row shows the accurate prediction of the network (Dense-2 U-net) for prostate, PZ, and CZ from left to right.
Figure 6
Ground truth images (A) and segmentation results of Dense-2 U-net (B) and the classical U-net (C) for the prostate (first row), PZ (second row), and CZ (third row). In each subfigure, the ground truth is on the left, the predicted mask is in the middle, and the overlap is on the right.
Figure 7
Contour consistency: (left) ground truth, (middle) overlap between the classical U-net segmentation mask and the ground truth, and (right) overlap between Dense-2 U-net segmentation mask and the ground truth.
Figure 8
Examples of inaccurate segmentation: the first row shows examples of inaccurate segmentation of the prostate gland while the second row shows examples for the peripheral zone.
Figure 9
Motion artifacts and their negative effect on segmentation. The images in the center are the ground truth and the images on the right are the result of segmentation with the Dense-2 U-net.
Figure 10
Images illustrating several special cases in the dataset. Each row represents a different case. Red masks indicate the ground truth segmentations, while yellow and blue masks represent the masks generated by the classical and Dense-2 U-net, respectively; the last two columns show the overlap between the ground truth and the predicted masks of the aforementioned networks.
Figure 11
Image augmentation using elastic deformation with different parameter values.
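Elastic deformation of the kind shown in Figure 11 is commonly implemented by Gaussian-smoothing a random displacement field and resampling the image through it; a sketch assuming NumPy/SciPy, where the function name and the `alpha`/`sigma` values are illustrative, not the paper's actual augmentation code:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def elastic_deform(image, alpha, sigma, seed=None):
    """Warp a 2D image with a smoothed random displacement field.
    alpha scales the displacement magnitude; sigma controls smoothness."""
    rng = np.random.default_rng(seed)
    # Random per-pixel displacements, smoothed so neighbors move together
    dy = gaussian_filter(rng.uniform(-1, 1, image.shape), sigma) * alpha
    dx = gaussian_filter(rng.uniform(-1, 1, image.shape), sigma) * alpha
    y, x = np.meshgrid(np.arange(image.shape[0]),
                       np.arange(image.shape[1]), indexing="ij")
    coords = np.array([y + dy, x + dx])
    # Bilinear resampling at the displaced coordinates
    return map_coordinates(image, coords, order=1, mode="reflect")

img = np.arange(64, dtype=float).reshape(8, 8)
warped = elastic_deform(img, alpha=4.0, sigma=2.0, seed=0)
```

The same displacement field must also be applied to the ground-truth mask (with nearest-neighbor interpolation) so that image and label stay aligned.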
Figure 12
The Dense-2 U-net architecture. Numbers in the figure indicate the number of feature maps at each stage.
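The DenseNet-inspired idea behind an architecture like the one in Figure 12 is that each convolutional layer receives the concatenation of all preceding feature maps in its block. A minimal PyTorch sketch of such a dense block, with illustrative channel counts and layer depth rather than the paper's exact configuration:

```python
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """DenseNet-style block: each layer sees the concatenation of the
    block input and every earlier layer's output (illustrative sizes)."""
    def __init__(self, in_channels, growth_rate=16, n_layers=3):
        super().__init__()
        self.layers = nn.ModuleList()
        for i in range(n_layers):
            channels_in = in_channels + i * growth_rate
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(channels_in),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels_in, growth_rate, kernel_size=3, padding=1),
            ))

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            out = layer(torch.cat(features, dim=1))
            features.append(out)  # dense connectivity: keep every output
        return torch.cat(features, dim=1)

block = DenseBlock(in_channels=8)
y = block(torch.randn(1, 8, 32, 32))
print(y.shape)  # torch.Size([1, 56, 32, 32]): 8 + 3 * 16 channels
```

In a U-net-shaped network, blocks like this would replace the plain double-convolution stages on the encoder and decoder paths, with the usual skip connections between matching resolutions.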

References

    1. Aldoj N, Lukas S, Dewey M, Penzkofer T. Semi-automatic classification of prostate cancer on multi-parametric MR imaging using a multi-channel 3D convolutional neural network. Eur. Radiol. 2020;30:1243–1253. doi: 10.1007/s00330-019-06417-z. - DOI - PubMed
    1. Siegel R. L, Miller K. D, Jemal A. Cancer statistics, 2016. CA Cancer J. Clin. 2016;66:7–30. doi: 10.3322/caac.21332. - DOI - PubMed
    1. Wang Y, et al. Towards personalized statistical deformable model and hybrid point matching for robust MR-TRUS registration. IEEE Trans. Med. Imaging. 2016;35:589–604. doi: 10.1109/TMI.2015.2485299. - DOI - PubMed
    1. Terris MK, Stamey TA. Determination of prostate volume by transrectal ultrasound. J. Urol. 1991;145:984–987. doi: 10.1016/S0022-5347(17)38508-7. - DOI - PubMed
    1. Zettinig O, et al. Multimodal image-guided prostate fusion biopsy based on automatic deformable registration. Int. J. Comput. Assist. Radiol. Surg. 2015;10:1997–2007. doi: 10.1007/s11548-015-1233-y. - DOI - PubMed

Publication types