Review

AJR Am J Roentgenol. 2025 Jun;224(6):e2532681. doi: 10.2214/AJR.25.32681. Epub 2025 Jun 18.

Interreader Agreement of Lung-RADS: A Systematic Review and Meta-Analysis

Jisun Hwang et al.

Abstract

BACKGROUND. Lung-RADS has shown variable interreader agreement in the literature, in part related to a broad range of factors that may influence the consistency of its implementation.

OBJECTIVE. The purpose of this study was to assess the interreader agreement of Lung-RADS and to investigate factors influencing the system's variability.

EVIDENCE ACQUISITION. The Embase, PubMed, and Cochrane databases were searched for original research studies published through June 18, 2024, that reported the interreader agreement of Lung-RADS on chest CT. Random-effects models were used to calculate pooled kappa coefficients for Lung-RADS categorization and pooled intraclass correlation coefficients (ICCs) for nodule size measurements. Potential sources of heterogeneity were explored using metaregression analyses.

EVIDENCE SYNTHESIS. The analysis included 11 studies (1470 patients) for Lung-RADS categorization and five studies (617 patients) for nodule size measurement. Interreader agreement for Lung-RADS categorization was substantial (κ = 0.72 [95% CI, 0.57-0.82]), and that for nodule size measurement was almost perfect (ICC = 0.97 [95% CI, 0.90-0.99]). Interreader agreement for Lung-RADS categorization was significantly associated with the method of nodule measurement (p = .005), with pooled kappa coefficients of 0.95, 0.91, and 0.66 for studies using computer-aided detection (CAD)-based semiautomated volume measurements, CAD-based semiautomated diameter measurements, and manual diameter measurements, respectively. Interreader agreement for Lung-RADS categorization was also significantly associated with studies' nodule type distribution (p < .001), with pooled kappa coefficients of 0.85, 0.76, and 0.55 for studies evaluating 100% solid nodules, 30-99% solid nodules, and fewer than 30% solid nodules, respectively. Interreader agreement for nodule size measurement was significantly associated with radiation dose (p < .001), with pooled ICCs of 0.97, 0.96, and 0.59 for studies that used standard-dose CT, low-dose CT, and ultralow-dose CT, respectively. Interreader agreement for nodule size measurement was also significantly associated with the Lung-RADS version that was used (p = .02), with pooled ICCs of 0.99 and 0.93 for studies using Lung-RADS 1.1 and Lung-RADS 1.0, respectively.

CONCLUSION. Although the findings support the overall reliability of Lung-RADS, they indicate roles for CAD assistance as well as training and standardized approaches for nodule type characterization to further promote reproducible application.

CLINICAL IMPACT. Consistent nodule assessments will be critical for Lung-RADS to optimally impact patient management and outcomes.
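The abstract's pooled kappa coefficients come from a random-effects model. As a minimal sketch of how such pooling is commonly done, the following implements DerSimonian-Laird random-effects weighting; the per-study kappa values and standard errors are made-up placeholders for illustration, not data from the reviewed studies, and the authors' exact software and model specification are not stated in the abstract.

```python
import math

def pool_random_effects(estimates, std_errs):
    """Pool per-study estimates with DerSimonian-Laird random effects."""
    # Fixed-effect (inverse-variance) weights and pooled estimate
    w = [1.0 / se**2 for se in std_errs]
    fixed = sum(wi * e for wi, e in zip(w, estimates)) / sum(w)
    # Cochran's Q and between-study variance tau^2
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, estimates))
    df = len(estimates) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Random-effects weights fold in tau^2
    w_re = [1.0 / (se**2 + tau2) for se in std_errs]
    pooled = sum(wi * e for wi, e in zip(w_re, estimates)) / sum(w_re)
    se_pooled = math.sqrt(1.0 / sum(w_re))
    return pooled, se_pooled

# Hypothetical per-study kappas and standard errors (not from the review)
kappas = [0.65, 0.78, 0.70]
ses = [0.05, 0.04, 0.06]
pooled, se = pool_random_effects(kappas, ses)
ci_low, ci_high = pooled - 1.96 * se, pooled + 1.96 * se
```

The same pooling applies to the ICCs for nodule size; in practice kappa and ICC values are often transformed (e.g. Fisher z) before pooling and back-transformed afterward, a step omitted here for brevity.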

Keywords: CT; cancer screening; lung neoplasms; observer variation; systematic review.
