Front Oncol. 2022 Jun 7;12:878104. doi: 10.3389/fonc.2022.878104. eCollection 2022.

Semi-Automatic Prostate Segmentation From Ultrasound Images Using Machine Learning and Principal Curve Based on Interpretable Mathematical Model Expression


Tao Peng et al. Front Oncol.

Abstract

Accurate prostate segmentation in transrectal ultrasound (TRUS) is a challenging problem due to the low contrast of TRUS images and the presence of imaging artifacts such as speckle and shadow regions. To address this issue, we propose a semi-automatic model termed Hybrid Segmentation Model (H-SegMod) for prostate Region of Interest (ROI) segmentation in TRUS images. H-SegMod contains two cascaded stages. The first stage obtains the vertex sequences using an improved principal curve-based model, with a few radiologist-selected seed points used as a prior. The second stage finds a mapping function that describes the smooth prostate contour using an improved machine learning model. Experimental results show that our proposed model outperformed several state-of-the-art models, achieving an average Dice Similarity Coefficient (DSC), Jaccard Similarity Coefficient (Ω), and Accuracy (ACC) of 96.5%, 95.2%, and 96.3%, respectively.
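The three reported metrics can be computed directly from binary segmentation masks. The following NumPy-based sketch is illustrative only (it is not the authors' implementation); the function name `segmentation_metrics` is a hypothetical helper:

```python
import numpy as np

def segmentation_metrics(pred: np.ndarray, gt: np.ndarray):
    """Return (DSC, Jaccard, Accuracy) for two binary masks of equal shape."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    tp = np.logical_and(pred, gt).sum()        # true positives
    fp = np.logical_and(pred, ~gt).sum()       # false positives
    fn = np.logical_and(~pred, gt).sum()       # false negatives
    tn = np.logical_and(~pred, ~gt).sum()      # true negatives
    # Dice = 2|A∩B| / (|A|+|B|); Jaccard = |A∩B| / |A∪B|; ACC over all pixels.
    dsc = 2 * tp / (2 * tp + fp + fn) if (2 * tp + fp + fn) else 1.0
    jac = tp / (tp + fp + fn) if (tp + fp + fn) else 1.0
    acc = (tp + tn) / pred.size
    return dsc, jac, acc
```

For example, a predicted mask that overlaps the ground truth in one of two foreground pixels yields DSC = 2/3 and Jaccard = 1/2, illustrating why DSC is always at least as large as Jaccard on the same pair of masks.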

Keywords: accurate prostate segmentation; constraint closed polygonal segment model; improved differential evolution-based method; interpretable mathematical model expression; machine learning; principal curve; transrectal ultrasound.


Conflict of interest statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Figures

Figure 1
The left image is an example of a TRUS image with a clear prostate boundary. The other two TRUS images show examples with weak or incomplete edge information. All TRUS images are from TaiZhou People's Hospital (see Section 3.1).
Figure 2
The flowchart of the proposed model, including the CCPS and IDEML, where the IDEML consists of the IAMCDE and the ABPNN.
Figure 3
Training and validation results at different numbers of hidden neurons. Green, blue, and black curves show how DSC, Ω, and ACC change with the number of neurons, respectively. Because all metrics follow nearly the same trend and peak at the same number of neurons, the red dotted line marks the position of the maximum for all metrics. (A, B) show the training and validation results, respectively.
Figure 4
Training and validation results at different epochs. (A, B) show the training and validation results, respectively.
Figure 5
Performance of the proposed model under different evaluation metrics (i.e., DSC, Ω, and ACC) on the testing set of 25 patients. The solid line shows the value for each patient, and the dotted line shows the average over all patients.
Figure 6
Visual comparison of prostate segmentation results.
Figure 7
Comparison between CSIM and the proposed model.
Figure 8
The worst result of the proposed method. Red and orange arrows point to intraprostatic calcifications and openings of the prostatic urethra, respectively.
Figure 9
Visual comparison with U-Net on an extreme case, using the image shown in Figure 1 (right). The first row shows the compared results, and the second row shows a partial magnification.


