Evaluation of Deep Learning to Augment Image-Guided Radiotherapy for Head and Neck and Prostate Cancers

Multicenter Study. Ozan Oktay et al. JAMA Netw Open. 2020 Nov 2;3(11):e2027426. doi: 10.1001/jamanetworkopen.2020.27426.

Erratum in

  • Error in Abstract. JAMA Netw Open. 2020 Dec 1;3(12):e2032624. doi: 10.1001/jamanetworkopen.2020.32624. PMID: 33295967. Free PMC article; no abstract available.

Abstract

Importance: Personalized radiotherapy planning depends on high-quality delineation of target tumors and surrounding organs at risk (OARs). This process puts additional time burdens on oncologists and introduces variability among both experts and institutions.

Objective: To explore clinically acceptable autocontouring solutions that can be integrated into existing workflows and used in different domains of radiotherapy.

Design, setting, and participants: This quality improvement study used a multicenter imaging data set comprising 519 pelvic and 242 head and neck computed tomography (CT) scans from 8 distinct clinical sites, from patients diagnosed with either prostate or head and neck cancer. The scans were acquired as part of treatment dose planning for patients who received intensity-modulated radiation therapy between October 2013 and February 2020. Fifteen different OARs were manually annotated by expert readers and radiation oncologists. The models were trained on a subset of the data set to automatically delineate OARs and evaluated on both internal and external data sets. Data analysis was conducted October 2019 to September 2020.

Main outcomes and measures: The autocontouring solution was evaluated on external data sets, and its accuracy was quantified with volumetric agreement and surface distance measures. Models were benchmarked against expert annotations in an interobserver variability (IOV) study. Clinical utility was evaluated by measuring time spent on manual corrections and annotations from scratch.
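The volumetric agreement measure referenced above, the Dice similarity coefficient, compares a model's predicted contour with an expert's as the overlap of two binary masks. A minimal sketch in Python with NumPy (the toy masks and function name are illustrative, not from the study's code):

```python
import numpy as np

def dice_score(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks, in [0, 1]."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    if total == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * intersection / total

# Toy example: two overlapping 2D "contours"
a = np.zeros((4, 4), dtype=bool)
b = np.zeros((4, 4), dtype=bool)
a[1:3, 1:3] = True   # 4 voxels
b[1:3, 1:4] = True   # 6 voxels, 4 of them shared with a
print(round(dice_score(a, b), 3))  # 2*4 / (4+6) = 0.8
```

A Dice score of 1.0 means perfect overlap; the study reports structure-level means such as 98.52% (i.e., 0.9852) for the left femur.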

Results: A total of 519 participants' (519 [100%] men; 390 [75%] aged 62-75 years) pelvic CT images and 242 participants' (184 [76%] men; 194 [80%] aged 50-73 years) head and neck CT images were included. The models achieved levels of clinical accuracy within the bounds of expert IOV for 13 of 15 structures (eg, left femur, κ = 0.982; brainstem, κ = 0.806) and performed consistently well across both external and internal data sets (eg, mean [SD] Dice score for left femur, internal vs external data sets: 98.52% [0.50] vs 98.04% [1.02]; P = .04). The correction time of autogenerated contours on 10 head and neck and 10 prostate scans was measured as a mean of 4.98 (95% CI, 4.44-5.52) min/scan and 3.40 (95% CI, 1.60-5.20) min/scan, respectively, to ensure clinically accepted accuracy. Manual segmentation of the head and neck took a mean 86.75 (95% CI, 75.21-92.29) min/scan for an expert reader and 73.25 (95% CI, 68.68-77.82) min/scan for a radiation oncologist. The autogenerated contours represented a 93% reduction in time.
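The reported 93% time reduction follows from the head and neck figures above: correcting autogenerated contours took a mean 4.98 min/scan versus 73.25 min/scan for a radiation oncologist contouring from scratch. The arithmetic, as a quick check:

```python
# Sanity-check the reported ~93% time reduction for head and neck scans:
# mean correction time of autogenerated contours vs a radiation oncologist
# annotating from scratch (values taken from the Results above).
correction_min_per_scan = 4.98
manual_min_per_scan = 73.25

reduction = 1 - correction_min_per_scan / manual_min_per_scan
print(f"{reduction:.0%}")  # 93%
```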

Conclusions and relevance: In this study, the models achieved levels of clinical accuracy within expert IOV while reducing manual contouring time and performing consistently well across previously unseen heterogeneous data sets. Combined with the availability of open-source libraries, this reliable performance creates significant opportunities for the transformation of radiation treatment planning.


Conflict of interest statement

Conflict of Interest Disclosures: Dr Jena reported receiving personal fees from Microsoft during the conduct of the study. Dr Noble reported receiving grants from Cancer Research UK and personal fees from Microsoft Research, Cambridge, during the conduct of the study. No other disclosures were reported.

Figures

Figure 1. Qualitative Evaluation of Expert and Autogenerated Contours on Head and Neck Computed Tomography Scans
Figure 2. Interexpert Variability in Prostate Contour Annotations (CT indicates computed tomography.)
Figure 3. Integration of the Proposed Segmentation Models Into Radiotherapy Planning Workflow

