AJR Am J Roentgenol. 2000 Jun;174(6):1769-77. doi: 10.2214/ajr.174.6.1741769.

Breast Imaging Reporting and Data System: inter- and intraobserver variability in feature analysis and final assessment


W A Berg et al. AJR Am J Roentgenol. 2000 Jun.

Abstract

Objective: We sought to evaluate the use of the Breast Imaging Reporting and Data System (BI-RADS) standardized mammography lexicon among and within observers and to distinguish variability in feature analysis from variability in lesion management.

Materials and methods: Five experienced mammographers, not specifically trained in BI-RADS, used the lexicon to describe and assess 103 screening mammograms, including 30 (29%) showing cancer, and a subset of 86 mammograms with diagnostic evaluation, including 23 (27%) showing cancer. A subset of 13 screening mammograms (two with malignant findings, 11 with diagnostic evaluation) was rereviewed by each observer 2 months later. Kappa statistics were calculated as measures of agreement beyond chance.
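The kappa statistic used in the methods corrects raw percent agreement for the agreement expected by chance from each reader's label frequencies. A minimal sketch of Cohen's kappa for two readers (the reader labels and data below are hypothetical, not from the study):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters beyond chance."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labeled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement expected from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two readers' BI-RADS final assessment
# categories for ten lesions.
reader1 = [0, 4, 5, 1, 2, 3, 4, 1, 2, 5]
reader2 = [0, 4, 4, 1, 2, 3, 5, 1, 3, 5]
print(round(cohens_kappa(reader1, reader2), 2))
```

A kappa of 1.0 indicates perfect agreement and 0 indicates chance-level agreement; values near the study's 0.37 for final assessment reflect only fair agreement. (The study's multi-reader kappas would use a generalization such as Fleiss' kappa rather than this pairwise form.)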

Results: After diagnostic evaluation, the interobserver kappa values for describing features were as follows: breast density, 0.43; lesion type, 0.75; mass borders, 0.40; special cases, 0.56; mass density, 0.40; mass shape, 0.28; microcalcification morphology, 0.36; and microcalcification distribution, 0.47. Lesion management was highly variable, with a kappa value for final assessment of 0.37. When we grouped assessments recommending immediate additional evaluation and biopsy (BI-RADS categories 0, 4, and 5 combined) versus follow-up (categories 1, 2, and 3 combined), five observers agreed on management for only 47 (55%) of 86 lesions. Intraobserver agreement on management (additional evaluation or biopsy versus follow-up) was seen in 47 (85%) of 55 interpretations, with a kappa value of 0.35-1.0 (mean, 0.60) for final assessment.
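The grouped comparison in the results dichotomizes the BI-RADS final assessment into two management actions before counting agreement. A sketch of that grouping, with hypothetical five-reader data (not the study's):

```python
def dichotomize(category):
    # BI-RADS 0, 4, 5 -> immediate additional evaluation or biopsy;
    # BI-RADS 1, 2, 3 -> follow-up.
    return "workup" if category in (0, 4, 5) else "follow-up"

def count_unanimous(assessments_per_lesion):
    """Count lesions on which all readers agree after dichotomizing."""
    return sum(
        len({dichotomize(c) for c in lesion}) == 1
        for lesion in assessments_per_lesion
    )

# Hypothetical assessments: each inner list holds five readers'
# BI-RADS categories for one lesion.
lesions = [[4, 5, 4, 0, 4], [2, 3, 1, 2, 2], [3, 4, 2, 3, 0]]
print(count_unanimous(lesions))  # first two lesions are unanimous -> 2
```

Under this grouping, the study found unanimous management agreement on only 47 of 86 lesions, despite the coarser two-way split.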

Conclusion: Inter- and intraobserver variability in mammographic interpretation is substantial for both feature analysis and management. Continued development of methods to improve standardization in mammographic interpretation is needed.
