J Med Imaging (Bellingham). 2016 Oct;3(4):044501.
doi: 10.1117/1.JMI.3.4.044501. Epub 2016 Nov 3.

Microscopic medical image classification framework via deep learning and shearlet transform


Hadi Rezaeilouyeh et al. J Med Imaging (Bellingham). 2016 Oct.

Abstract

Cancer is the second leading cause of death in the US after cardiovascular disease. Image-based computer-aided diagnosis can help physicians diagnose cancers efficiently at early stages. Existing computer-aided algorithms use hand-crafted features, such as wavelet coefficients, co-occurrence matrix features, and, more recently, histograms of shearlet coefficients, to classify cancerous tissues and cells in images. These hand-crafted features often lack generalizability because every cancerous tissue and cell has a specific texture, structure, and shape. An alternative approach is to use convolutional neural networks (CNNs), which learn the most appropriate feature abstractions directly from the data and thereby avoid the limitations of hand-crafted features. We present a framework for breast cancer detection and prostate Gleason grading using a CNN trained on images together with the magnitude and phase of their shearlet coefficients. Specifically, we apply the shearlet transform to each image and extract the magnitude and phase of the shearlet coefficients. We then feed these shearlet features, along with the original images, to a CNN consisting of multiple convolution, max-pooling, and fully connected layers. Our experiments show that supplying the magnitude and phase of shearlet coefficients as extra inputs to the network improves detection accuracy and generalizes better than state-of-the-art methods that rely on hand-crafted features. This study extends the application of deep neural networks to medical image analysis, a challenging domain given the limited medical data available.
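The feature extraction described above reduces to splitting complex-valued shearlet coefficients into magnitude and phase channels. A minimal sketch, assuming the complex coefficients have already been computed by a shearlet-transform library (the toy `coeffs` grid here is a stand-in, not real transform output):

```python
import cmath

def magnitude_phase(coeffs):
    """Split complex shearlet coefficients into magnitude and phase channels.

    `coeffs` is a 2-D list of complex numbers standing in for one
    shearlet subband; a real pipeline would obtain these from a
    shearlet-transform implementation.
    """
    mag = [[abs(c) for c in row] for row in coeffs]
    phase = [[cmath.phase(c) for c in row] for row in coeffs]
    return mag, phase

# Toy subband: one complex coefficient per pixel.
coeffs = [[3 + 4j, 1 + 0j], [0 + 2j, -1 - 1j]]
mag, phase = magnitude_phase(coeffs)
# e.g. abs(3+4j) == 5.0 and cmath.phase(1+0j) == 0.0
```

The magnitude and phase grids would then be stacked alongside the RGB channels as additional network inputs.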

Keywords: breast cancer; deep neural network; microscopic images; prostate cancer; shearlet transform.


Figures

Fig. 1
Block diagram of our proposed framework consisting of the training and test phases.
Fig. 2
Prostate tissue samples with different Gleason grades: (a) grade 2 and (b) grade 5.
Fig. 3
Sample images of breast tissue and their corresponding magnitude and phase of shearlet coefficients from a single subband: (a) original benign, (b) original malignant, (c) magnitude of shearlets for benign, (d) magnitude of shearlets for malignant, (e) phase of shearlets for benign, and (f) phase of shearlets for malignant.
Fig. 4
Histograms of shearlet coefficients (HSCs) for (a) a correctly classified benign and malignant pair and (b) an incorrectly classified pair.
Fig. 5
Block diagram of our deep neural network. The inputs are the RGB images, the magnitude of the shearlet coefficients from decomposition levels 1 to 5 (Mag1 to Mag5), and the phase of the shearlet coefficients from decomposition levels 1 to 5 (Phase1 to Phase5). Each input passes through a separate CNN; the resulting features are concatenated by a fully connected layer, which sends the final evolved features to a softmax classifier.
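The fusion step in Fig. 5 concatenates one 64-dimensional feature vector per branch (the 64-dim size comes from the Fig. 6 caption). A small sketch of the branch bookkeeping, using placeholder zero vectors since the actual CNN outputs are not specified here:

```python
# Eleven CNN branches feed the fusion layer: the RGB image plus the
# magnitude and phase of shearlet coefficients at five decomposition
# levels. Each branch emits a 64-dim feature vector (per Fig. 6);
# the values below are placeholders, only the dimensions matter.
branches = ["RGB"] + [f"Mag{i}" for i in range(1, 6)] + [f"Phase{i}" for i in range(1, 6)]
features = {name: [0.0] * 64 for name in branches}

# Concatenate branch outputs into the vector seen by the fused
# fully connected layer.
fused = [x for name in branches for x in features[name]]
print(len(branches), len(fused))  # 11 704
```

So the fully connected fusion layer operates on an 11 × 64 = 704-dimensional concatenated feature vector.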
Fig. 6
Architecture of our CNN. The input is a 120×120 patch, which can be an RGB image or the magnitude or phase of shearlet coefficients. Three convolution-and-pooling layers are applied back to back to extract abstractions from the input. Finally, a fully connected layer combines the outputs of the convolution filters into a single feature vector of size 64.
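The caption fixes the input size (120×120) and the depth (three conv/pool stages) but not the kernel or pooling sizes. Assuming, purely for illustration, 5×5 "valid" convolutions and non-overlapping 2×2 max pooling, the spatial size can be traced through the stack:

```python
def conv_pool_output(size, conv_k=5, pool_k=2, layers=3):
    """Trace the spatial size of a square patch through repeated
    valid-convolution + max-pooling stages.

    The 5x5 kernel and 2x2 pooling are illustrative assumptions; the
    paper's caption only states three conv/pool layers on a 120x120 input.
    """
    sizes = [size]
    for _ in range(layers):
        size = size - (conv_k - 1)  # valid convolution shrinks by k-1
        size = size // pool_k       # non-overlapping pooling halves (floor)
        sizes.append(size)
    return sizes

print(conv_pool_output(120))  # [120, 58, 27, 11]
```

Under these assumptions an 11×11 feature map reaches the fully connected layer, which maps it to the 64-dimensional output vector.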
Fig. 7
Augmented images of a sample breast tissue image from our dataset.
Fig. 8
Feature evolution: (a) first convolutional layer output features for magnitude of shearlet coefficients from first decomposition level, (b) first convolutional layer output features for magnitude of shearlet coefficients from third decomposition level, (c) second convolutional layer output features for magnitude of shearlet coefficients from first decomposition level, and (d) third (last) convolutional layer output features for magnitude of shearlet coefficients from third decomposition level.
Fig. 9
ROC curves for breast cancer diagnosis experiment using the best hand-crafted feature extraction method and our best deep neural network results.
