Observational Study
Sci Rep. 2022 Aug 12;12(1):13710.
doi: 10.1038/s41598-022-17976-5.

Deep learning-based diagnosis from endobronchial ultrasonography images of pulmonary lesions


Takamasa Hotta et al. Sci Rep. 2022.

Abstract

Endobronchial ultrasonography with a guide sheath (EBUS-GS) improves the accuracy of bronchoscopy. The ability to differentiate benign from malignant lesions based on EBUS findings may be useful in reaching the correct diagnosis. We investigated whether a convolutional neural network (CNN) model could predict benign or malignant (lung cancer) lesions based on EBUS findings. This was an observational, single-center cohort study. Using medical records, patients were divided into benign and malignant groups. We acquired EBUS data for 213 participants. A total of 2,421,360 images were extracted from the learning dataset. We trained and externally validated a CNN algorithm to predict benign or malignant lung lesions. Testing was performed using 26,674 images. The dataset was also interpreted by four bronchoscopists. The accuracy, sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of the CNN model for distinguishing benign from malignant lesions were 83.4%, 95.3%, 53.6%, 83.8%, and 82.0%, respectively. For the four bronchoscopists, accuracy was 68.4%, sensitivity 80%, specificity 39.6%, PPV 76.8%, and NPV 44.2%. The developed EBUS computer-aided diagnosis system is expected to interpret EBUS findings that are difficult for clinicians to judge with precision and to help differentiate between benign lesions and lung cancer.
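The reported accuracy, sensitivity, specificity, PPV, and NPV all follow from the counts of true and false positives and negatives on the test images; a minimal Python sketch of those definitions (the function name and the confusion-matrix counts are illustrative, not taken from the study):

def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    # Standard binary-classification metrics, with "malignant" as the positive class.
    # tp/fp/tn/fn are illustrative confusion-matrix counts, not the study's actual numbers.
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "sensitivity": tp / (tp + fn),   # proportion of malignant lesions correctly identified
        "specificity": tn / (tn + fp),   # proportion of benign lesions correctly identified
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }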


Conflict of interest statement

NK has received personal fees from Olympus Medical Systems Corporation outside the work performed. YT reports personal fees from AstraZeneca, Daiichi Sankyo Co., Ltd., Pfizer Health Research Foundation, and Chugai Pharmaceutical Co., Ltd., outside the work performed. TI has received personal fees from Boehringer-Ingelheim, AstraZeneca, Daiichi Sankyo Co. Ltd, Pearl Therapeutics Inc., Janssen Pharmaceutical K.K., and Pfizer outside the work performed. TH, YS, YA, MH, and AT have no conflicts of interest to declare. We built a CNN-CAD system in partnership with Olympus Medical Systems Corporation. However, there is no economic benefit to Olympus Medical Systems Corporation in this study. This work was supported by JSPS KAKENHI Grant Number JP22K18185.

Figures

Figure 1
Convolutional neural network architecture used in this study. Feature extraction with the CNN consists of two stages, each comprising multiple blocks and one pooling layer. The first stage consists of 11 blocks and one pooling layer, while the second consists of 16 blocks and one global average pooling layer. Each block consists of a convolution layer (Conv), a batch normalization layer (BN), and a rectified linear unit (ReLU) function. Conv is a dilated convolution with a kernel size of 3; both the dilation and padding sizes are 3. The number of Conv channels in the first and second stages is 135 and 270, respectively. The classification neural network is composed of a fully connected layer (FC) and a softmax layer. The figure was generated with PlotNeuralNet and modified. PlotNeuralNet v1.0.0 (https://github.com/HarisIqbal88/PlotNeuralNet) is released under the MIT License (https://opensource.org/licenses/mit-license.php).
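The caption fixes the block composition, block counts, and channel widths, but not the pooling type, stride, or input format; a minimal PyTorch sketch under those assumptions (single-channel input, 2x2 max pooling, and the names EbusCnn and conv_block are illustrative, not the authors' implementation):

import torch
import torch.nn as nn

def conv_block(in_ch: int, out_ch: int) -> nn.Sequential:
    # One block: dilated 3x3 convolution (dilation 3, padding 3), batch normalization, ReLU.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, dilation=3, padding=3),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

class EbusCnn(nn.Module):
    def __init__(self, in_channels: int = 1, num_classes: int = 2):
        super().__init__()
        # Stage 1: 11 blocks at 135 channels, followed by one pooling layer (assumed 2x2 max pooling).
        blocks1 = [conv_block(in_channels, 135)] + [conv_block(135, 135) for _ in range(10)]
        self.stage1 = nn.Sequential(*blocks1, nn.MaxPool2d(kernel_size=2))
        # Stage 2: 16 blocks at 270 channels, followed by global average pooling.
        blocks2 = [conv_block(135, 270)] + [conv_block(270, 270) for _ in range(15)]
        self.stage2 = nn.Sequential(*blocks2, nn.AdaptiveAvgPool2d(1))
        # Classification network: fully connected layer and softmax over benign/malignant.
        self.fc = nn.Linear(270, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = self.stage2(self.stage1(x))
        logits = self.fc(torch.flatten(features, 1))
        return torch.softmax(logits, dim=1)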
Figure 2
Data preprocessing flow and breakdown of the analysis data in this study. The training image dataset comprised 55,376 images of 76 adenocarcinomas, 27,038 images of 41 squamous cell carcinomas, 5,136 images of 10 small cell carcinomas, and 33,518 images of 44 benign lesions. Data augmentation was applied to the training image dataset. The test dataset comprised 11,650 images of 16 adenocarcinomas, 4,473 images of 9 squamous cell carcinomas, 2,952 images of 5 small cell carcinomas, and 7,599 images of 12 benign lesions. The ratio of the training to test datasets was 80:20. AD, adenocarcinoma; SCC, squamous cell carcinoma; SCLC, small cell lung cancer.
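The 80:20 ratio together with the per-case image counts suggests the split was made at the case level, so that all images from one lesion fall on the same side of the split; a small sketch of such a split (the split_cases function and the fixed seed are illustrative assumptions, not the authors' code):

import random

def split_cases(case_ids: list[str], test_ratio: float = 0.2, seed: int = 0):
    # Shuffle case identifiers and split them 80:20, keeping every image of a
    # case in the same set to avoid patient-level leakage between training and test.
    ids = sorted(case_ids)
    random.Random(seed).shuffle(ids)
    n_test = round(len(ids) * test_ratio)
    return ids[n_test:], ids[:n_test]  # (training cases, test cases)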
Figure 3
Accuracy for each case. A case was judged as correct when 50% or more of its images were classified correctly. Cases in which the endobronchial ultrasound visualization was "adjacent to" the lesion are shown with a striped pattern in the graph.
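The per-case rule in this figure reduces the image-level predictions to a single case-level judgment; a one-function sketch of that rule (the function name is illustrative):

def case_is_correct(image_correct_flags: list[bool]) -> bool:
    # A case counts as correctly diagnosed when 50% or more of its images
    # were individually classified as the correct class.
    return sum(image_correct_flags) / len(image_correct_flags) >= 0.5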
Figure 4
Visualization technique: gradient-weighted class activation mapping (Grad-CAM). Areas suspected of being malignant are shown in red, and areas suspected of being benign are shown in blue.
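Grad-CAM highlights the image regions that drive a class score by weighting a convolutional layer's activations with the spatially averaged gradients of that score; a minimal PyTorch sketch of the standard procedure (the grad_cam function, the choice of target layer, and the single-image usage are assumptions, not the authors' implementation):

import torch
import torch.nn.functional as F

def grad_cam(model, target_layer, image, class_idx):
    # Capture the target layer's activations and the gradient of the class score
    # with respect to them, then form a ReLU-ed, channel-weighted heatmap.
    activations, gradients = [], []
    fwd = target_layer.register_forward_hook(lambda m, i, o: activations.append(o))
    bwd = target_layer.register_full_backward_hook(lambda m, gi, go: gradients.append(go[0]))
    try:
        model.zero_grad()
        score = model(image.unsqueeze(0))[0, class_idx]
        score.backward()
        acts, grads = activations[0], gradients[0]
        weights = grads.mean(dim=(2, 3), keepdim=True)            # global-average-pooled gradients
        cam = F.relu((weights * acts).sum(dim=1, keepdim=True))   # weighted sum over channels
        cam = F.interpolate(cam, size=image.shape[-2:], mode="bilinear", align_corners=False)
        cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)  # normalize to [0, 1] for overlay
        return cam[0, 0].detach()
    finally:
        fwd.remove()
        bwd.remove()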

