High-Throughput Classification of Radiographs Using Deep Convolutional Neural Networks
- PMID: 27730417
- PMCID: PMC5267603
- DOI: 10.1007/s10278-016-9914-9
Abstract
The study aimed to determine whether computer vision techniques rooted in deep learning can use a small set of radiographs to perform clinically relevant image classification with high fidelity. One thousand eight hundred eighty-five chest radiographs from 909 patients, obtained between January 2013 and July 2015 at our institution, were retrieved and anonymized. The source images were manually annotated as frontal or lateral and randomly divided into training, validation, and test sets. The training and validation sets were augmented to over 150,000 images using standard image manipulations. We then pre-trained a series of deep convolutional networks based on the open-source GoogLeNet with various transformations of the open-source ImageNet (non-radiology) images. These trained networks were then fine-tuned using the original and augmented radiology images. The model with the highest validation accuracy was applied to our institutional test set and to a publicly available set. Accuracy was assessed using the Youden Index to set a binary cutoff for frontal or lateral classification. This retrospective study was IRB approved prior to initiation. A network pre-trained on 1.2 million greyscale ImageNet images and fine-tuned on augmented radiographs was chosen. The binary classification method correctly classified 100% (95% CI 99.73-100%) of both our test set and the publicly available images. Classification was rapid, at 38 images per second. A deep convolutional neural network created using non-radiological images and an augmented set of radiographs is effective for highly accurate classification of chest radiograph view type and is a feasible, rapid method for high-throughput annotation.
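As a rough illustration of the pipeline summarized above, the sketch below fine-tunes an ImageNet-pretrained GoogLeNet for binary frontal/lateral classification and derives a decision threshold from the Youden Index. PyTorch/torchvision and scikit-learn are assumed tools for illustration only; the abstract does not specify the authors' software, and the augmentations, classifier head, and hyperparameters shown are likewise illustrative rather than the study's exact configuration.

# Minimal sketch, assuming PyTorch/torchvision (>= 0.13) and scikit-learn.
import numpy as np
import torch
import torch.nn as nn
from torchvision import models, transforms
from sklearn.metrics import roc_curve

# Example "standard image manipulations" for augmenting the training set;
# the exact manipulations used in the study are not detailed in the abstract.
train_transform = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),  # radiographs are greyscale
    transforms.RandomRotation(10),
    transforms.RandomResizedCrop(224, scale=(0.8, 1.0)),
    transforms.ToTensor(),
])

# Load a GoogLeNet pretrained on ImageNet and replace the classifier head
# with a single logit for the frontal-vs-lateral decision.
model = models.googlenet(weights=models.GoogLeNet_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, 1)

criterion = nn.BCEWithLogitsLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
# (fine-tuning loop over the original and augmented radiographs would go here)

def youden_cutoff(labels: np.ndarray, scores: np.ndarray) -> float:
    """Return the score threshold maximizing sensitivity + specificity - 1."""
    fpr, tpr, thresholds = roc_curve(labels, scores)
    return float(thresholds[np.argmax(tpr - fpr)])

After fine-tuning, the threshold returned by youden_cutoff on validation scores would be applied to the model's sigmoid outputs on held-out images to make the binary frontal/lateral call.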
Keywords: Artificial neural networks; Chest radiographs; Computer vision; Convolutional neural network; Deep learning; Machine learning; Radiography.
Conflict of interest statement
Compliance with Ethical Standards. Competing Interests: Alvin Rajkomar reports having received fees as a research advisor from Google. Funding: This research did not receive any specific grant from funding organizations.