Sci Rep. 2022 Oct 11;12(1):17024.
doi: 10.1038/s41598-022-20653-2.

Ovarian tumor diagnosis using deep convolutional neural networks and a denoising convolutional autoencoder

Yuyeon Jung et al. Sci Rep. 2022.

Erratum in

Abstract

Discrimination of ovarian tumors is necessary for proper treatment. In this study, we developed a convolutional neural network model with a convolutional autoencoder (CNN-CAE) to classify ovarian tumors. A total of 1613 ultrasound images of ovaries with known pathological diagnoses were pre-processed and augmented for deep learning analysis. We designed a CNN-CAE model that removes unnecessary information (e.g., calipers and annotations) from ultrasound images and classifies ovaries into five classes. We used fivefold cross-validation to evaluate the performance of the CNN-CAE model in terms of accuracy, sensitivity, specificity, and the area under the receiver operating characteristic curve (AUC). Gradient-weighted class activation mapping (Grad-CAM) was applied to qualitatively visualize and verify the CNN-CAE model's results. In classifying normal ovaries versus ovarian tumors, the CNN-CAE model achieved 97.2% accuracy, 97.2% sensitivity, and an AUC of 0.9936 with the DenseNet121 architecture. In distinguishing malignant ovarian tumors, it achieved 90.12% accuracy, 86.67% sensitivity, and an AUC of 0.9406 with the DenseNet161 architecture. Grad-CAM showed that the CNN-CAE model recognizes valid texture and morphology features in the ultrasound images and classifies ovarian tumors from these features. CNN-CAE is a feasible diagnostic tool capable of robustly classifying ovarian tumors while eliminating marks on ultrasound images, and it demonstrates important value for clinical application.
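The reported accuracy, sensitivity, and specificity follow the standard confusion-matrix definitions. A minimal sketch in plain Python (the labels and predictions below are illustrative, not study data):

```python
def binary_metrics(y_true, y_pred):
    """Return (accuracy, sensitivity, specificity) for 0/1 labels,
    where 1 denotes the positive class (e.g., tumor)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn) if tp + fn else 0.0  # true positive rate
    specificity = tn / (tn + fp) if tn + fp else 0.0  # true negative rate
    return accuracy, sensitivity, specificity

acc, sens, spec = binary_metrics([1, 1, 0, 0, 1, 0], [1, 0, 0, 0, 1, 1])
```

The AUC would additionally require the model's continuous scores rather than hard 0/1 predictions, since the ROC curve sweeps over decision thresholds.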

Conflict of interest statement

The authors declare no competing interests.

Figures

Figure 1
Ultrasound images before and after removing the marks via the convolutional autoencoder. The first row shows the images with marks, and the second row shows the same images without marks. Example images, from left to right: normal, cystadenoma, mature cystic teratoma, endometrioma, and malignancy.
Figure 2
Multiclass classification of ultrasound images: heat maps of the confusion matrices for the two best-performing models, DenseNet121 and DenseNet161.
Figure 3
Receiver operating characteristic (ROC) curves of the classification results for two models, DenseNet121 (left) and DenseNet161 (right). The ROC curves are based on the binary results for each class.
Figure 4
Convolutional neural network visualization of ultrasound images via a gradient-weighted class activation map (Grad-CAM). The first two rows (“Before CAE”) show the images with marks and the corresponding Grad-CAM results, and the next two rows (“After CAE”) show the images with marks removed and the corresponding Grad-CAM results.
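Grad-CAM weights the last convolutional feature maps by the spatially averaged gradients of the class score, then applies a ReLU to keep only positively contributing regions. A sketch in PyTorch on a toy CNN (not the paper's architecture; the layer sizes and five-class head are illustrative assumptions):

```python
import torch
import torch.nn as nn

# Toy classifier: one conv layer, global pooling, five-class head.
model = nn.Sequential(
    nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 5),
)

acts, grads = {}, {}

def save_act(module, inputs, output):
    acts["a"] = output  # feature maps of the last conv layer

def save_grad(module, grad_input, grad_output):
    grads["g"] = grad_output[0]  # gradient of the score w.r.t. those maps

model[0].register_forward_hook(save_act)
model[0].register_full_backward_hook(save_grad)

x = torch.randn(1, 1, 16, 16)
score = model(x)[0].max()  # score of the top predicted class
score.backward()

# Channel weights = spatial average of gradients; CAM = ReLU of the
# weighted sum of feature maps.
weights = grads["g"].mean(dim=(2, 3), keepdim=True)
cam = torch.relu((weights * acts["a"]).sum(dim=1))
```

In practice the heat map is then upsampled to the input resolution and overlaid on the ultrasound image, as in the figure.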
Figure 5
Study flow chart. A deep learning method, a convolutional autoencoder (CAE), was used in the pre-processing stage, and hyper-parameter tuning was conducted in the deep learning model training process.
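The fivefold cross-validation in the flow chart partitions the 1613 images into five roughly equal folds, each serving once as the validation set. A generic stdlib sketch (the splitting logic is illustrative; any stratification or shuffling the study used is not shown):

```python
def kfold_indices(n, k=5):
    """Yield (train_idx, val_idx) index lists for k roughly equal folds."""
    # The first n % k folds get one extra sample each.
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        val = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n))
        yield train, val
        start += size

# Five folds over the 1613 images reported in the study.
folds = list(kfold_indices(1613, k=5))
```

Each model is trained k times; the reported metrics are then aggregated across the k validation folds.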
Figure 6
The architecture of the convolutional autoencoder model. This model is designed to remove marks on images and generate high-resolution pixels to replace the marks.
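An encoder-decoder of this kind can be sketched in PyTorch as follows; the channel widths, depths, and kernel sizes are illustrative assumptions, not the paper's exact configuration. Stride-2 convolutions compress the marked image, and transposed convolutions regenerate pixels at the original resolution:

```python
import torch
import torch.nn as nn

class DenoisingCAE(nn.Module):
    """Minimal denoising convolutional autoencoder sketch:
    input is a marked grayscale image, output is a same-size
    reconstruction with the marks removed."""

    def __init__(self):
        super().__init__()
        # Encoder: two stride-2 convolutions halve the resolution twice.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Decoder: transposed convolutions restore the input resolution.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 3, stride=2, padding=1,
                               output_padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 3, stride=2, padding=1,
                               output_padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

cae = DenoisingCAE()
out = cae(torch.zeros(1, 1, 64, 64))
```

Training such a model pairs marked images with clean targets and minimizes a pixel-wise reconstruction loss (e.g., MSE), so the network learns to inpaint the regions covered by calipers and annotations.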
Figure 7
Structure of the DenseNet model. DenseNet uses a DenseBlock that employs fewer parameters while enhancing the information flow and gradient flow.
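The DenseBlock's defining trait is that each layer receives the concatenation of all preceding feature maps, which shortens gradient paths and lets features be reused with relatively few parameters. A toy PyTorch sketch (the growth rate and depth are assumptions, far smaller than DenseNet121/161):

```python
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """Toy dense block: layer i sees in_ch + i * growth channels
    and contributes `growth` new feature maps."""

    def __init__(self, in_ch, growth=4, layers=3):
        super().__init__()
        self.convs = nn.ModuleList(
            nn.Conv2d(in_ch + i * growth, growth, 3, padding=1)
            for i in range(layers)
        )

    def forward(self, x):
        features = [x]
        for conv in self.convs:
            # Concatenate every earlier output along the channel axis.
            features.append(torch.relu(conv(torch.cat(features, dim=1))))
        return torch.cat(features, dim=1)

block = DenseBlock(in_ch=8)
y = block(torch.zeros(1, 8, 16, 16))  # output channels: 8 + 3 * 4 = 20
```

Full DenseNet variants stack several such blocks with transition layers in between; DenseNet121 and DenseNet161 differ in block depths and growth rate.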

References

    1. Martínez-Más J, et al. Evaluation of machine learning methods with Fourier Transform features for classifying ovarian tumors based on ultrasound images. PLoS ONE. 2019;14:e0219388. doi: 10.1371/journal.pone.0219388. - DOI - PMC - PubMed
    1. Cho BJ, et al. Classification of cervical neoplasms on colposcopic photography using deep learning. Sci. Rep. 2020;10:13652. doi: 10.1038/s41598-020-70490-4. - DOI - PMC - PubMed
    1. Al-Antari MA, Al-Masni MA, Kim TS. Deep learning computer-aided diagnosis for breast lesion in digital mammogram. Adv. Exp. Med. Biol. 2020;1213:59–72. doi: 10.1007/978-3-030-33128-3_4. - DOI - PubMed
    1. Khazendar S, et al. Automated characterisation of ultrasound images of ovarian tumours: The diagnostic accuracy of a support vector machine and image processing with a local binary pattern operator. Facts Views Vis. Obgyn. 2015;7:7–15. - PMC - PubMed
    1. Wen B, et al. 3D texture analysis for classification of second harmonic generation images of human ovarian cancer. Sci. Rep. 2016;6:35734. doi: 10.1038/srep35734. - DOI - PMC - PubMed

Publication types