J Healthc Eng. 2017;2017:9283480.
doi: 10.1155/2017/9283480. Epub 2017 Jun 13.

Low-Grade Glioma Segmentation Based on CNN with Fully Connected CRF

Zeju Li et al. J Healthc Eng. 2017.

Abstract

This work proposed an automatic three-dimensional (3D) magnetic resonance imaging (MRI) segmentation method intended for clinical diagnosis of glioma, the most common and aggressive brain tumor. The method combined a multipathway convolutional neural network (CNN) with a fully connected conditional random field (CRF). First, 3D information was introduced into the CNN, which improved the recognition of low-contrast gliomas. Then, a fully connected CRF was applied as a postprocessing step to refine the delineation of the glioma boundary. The method was applied to T2-FLAIR MRI images of 160 low-grade glioma patients. With 59 cases used for training and manual segmentation as the ground truth, the method achieved a Dice similarity coefficient (DSC) of 0.85 on the test set of 101 MRI images, compared with 0.76 for another state-of-the-art CNN method on the same dataset. These results indicate that the proposed method produces better segmentations of low-grade gliomas.
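The Dice similarity coefficient reported above can be computed directly from binary masks. A minimal sketch in plain NumPy (the function name and mask shapes are illustrative, not from the paper):

```python
import numpy as np

def dice_coefficient(pred, truth):
    """Dice similarity coefficient (DSC) between two binary masks.

    DSC = 2|A ∩ B| / (|A| + |B|); ranges from 0 (no overlap) to 1 (identical).
    """
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    if total == 0:
        return 1.0  # both masks empty: treated as perfect agreement
    return 2.0 * intersection / total
```

The DSC of 0.85 reported for the test set would correspond to averaging this score over the 101 test volumes.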


Figures

Figure 1
The flowchart of our method.
Figure 2
Two network structures that make better use of 3D information: early fusion (top) and late fusion (bottom).
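The two fusion schemes differ in where 3D context enters the network: early fusion stacks neighboring slices as input channels of a single pathway, while late fusion runs each slice through its own pathway and merges the resulting features. A schematic NumPy sketch (the patch size, feature dimension, and per-slice pathway are hypothetical stand-ins, not the paper's actual architecture):

```python
import numpy as np

H = W = 65  # assumed patch size fed to the CNN (illustrative only)
slices = [np.random.rand(H, W) for _ in range(3)]  # an axial slice and its two neighbors

# Early fusion: neighboring slices become input channels of one pathway,
# so the first convolutional layer sees the 3D context directly.
early_input = np.stack(slices, axis=0)  # shape (3, H, W)

def pathway(x):
    # Stand-in for a per-slice convolutional pathway that produces a
    # feature vector; a real pathway would be a stack of conv layers.
    return x.mean() * np.ones(8)  # hypothetical 8-dim feature

# Late fusion: each slice is processed separately and the feature maps
# are merged (here by concatenation) before the fully connected layers.
late_features = np.concatenate([pathway(s) for s in slices])  # shape (24,)
```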
Figure 3
Training curves of different network structures. (a) Convergence of the objective function during training; (b) change in training error.
Figure 4
Boxplots of segmentation results over all data for different CNN structures. In each group, the first plot is the baseline, the second the deeper network, the third the network with more neurons in the fully connected layers, and the last the network that is both deeper and has more neurons in the fully connected layers.
Figure 5
Segmentation results of networks with different depths. Each row corresponds to a case; each column shows the segmentation result of one network structure.
Figure 6
Network features for case one in Figure 4. Each column corresponds to a network structure and each row to one depth; the output of the last filter in the filter bank is shown.
Figure 7
Boxplots of segmentation results over all data for different CNN structures combined with the fully connected CRF. In each group, the first plot shows the one-way CNN, the second the one-way CNN with the CRF, the third the early-fusion structure with the CRF, and the last the late-fusion structure with the CRF.
Figure 8
Segmentation results of different network structures combined with the fully connected CRF. Each row corresponds to a case; each column shows the segmentation result of one network structure. The 3D reconstructions of the tumors are shown in the second row of each group.
Figure 9
Score maps fed into the CRF for different network structures. Each row corresponds to a case; each column corresponds to a network structure.
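The fully connected CRF consumes these CNN score maps as unary potentials and refines them by mean-field inference. A toy sketch of that idea, using only a single Gaussian spatial kernel with exact pairwise sums (the paper's postprocessing follows the dense-CRF formulation of Krähenbühl and Koltun, which also uses an appearance kernel and an efficient filtering approximation; the parameter values here are illustrative):

```python
import numpy as np

def dense_crf_meanfield(unary_probs, iters=5, sigma=3.0, compat=3.0):
    """Toy mean-field inference for a fully connected CRF with one Gaussian
    spatial kernel. Pairwise sums are computed exactly over every pixel
    pair, so this is only practical for small images.

    unary_probs: array of shape (L, H, W) -- class probabilities,
                 e.g. the CNN score maps after a softmax.
    """
    L, H, W = unary_probs.shape
    # pixel coordinates and Gaussian affinity between every pair of pixels
    ij = np.stack(np.meshgrid(np.arange(H), np.arange(W), indexing="ij"), -1)
    coords = ij.reshape(-1, 2).astype(float)
    d2 = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(K, 0.0)                   # exclude self-messages
    U = -np.log(np.clip(unary_probs.reshape(L, -1), 1e-8, 1.0))
    Q = unary_probs.reshape(L, -1).copy()
    for _ in range(iters):
        msg = Q @ K                            # Gaussian-weighted messages (L, N)
        # Potts compatibility: penalize mass that neighbors assign to OTHER labels
        energy = U + compat * (msg.sum(0, keepdims=True) - msg)
        energy -= energy.min(0, keepdims=True)  # numerical stability
        Q = np.exp(-energy)
        Q /= Q.sum(0, keepdims=True)
    return Q.reshape(L, H, W)
```

The effect matches what the score-map figures illustrate: isolated low-confidence pixels are smoothed toward the labels of the surrounding region, sharpening the tumor boundary.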
Algorithm 1
Proposed segmentation method.

