Interobserver agreement issues in radiology

M Benchoufi et al. Diagn Interv Imaging. 2020 Oct;101(10):639-641.
doi: 10.1016/j.diii.2020.09.001. Epub 2020 Sep 18.
Free article

Abstract

Agreement between observers (i.e., inter-rater agreement) can be quantified with various criteria, but their appropriate selection is critical. When the measure is qualitative (nominal or ordinal), the proportion of agreement or the kappa coefficient should be used to evaluate inter-rater consistency (i.e., inter-rater reliability). The kappa coefficient is more meaningful than the raw percentage of agreement, because the latter does not account for agreement due to chance alone. When the measures are quantitative, the intraclass correlation coefficient (ICC) should be used to assess agreement, but with care: several ICCs exist, so it is important to describe the model and type of ICC being used. The Bland-Altman method can be used to assess consistency and conformity, but its use should be restricted to comparisons between two raters.
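To make these criteria concrete, the following is a minimal, self-contained Python sketch (not from the article; the rating data are simulated for illustration). It contrasts the raw proportion of agreement with Cohen's kappa for two raters on a nominal scale, computes a two-way random-effects ICC(2,1) from the ANOVA mean squares following the Shrout and Fleiss convention, and reports the Bland-Altman bias and 95% limits of agreement for two raters.

```python
import numpy as np

# --- Cohen's kappa: chance-corrected agreement for qualitative ratings ------
def cohen_kappa(r1, r2):
    """kappa = (p_o - p_e) / (1 - p_e), where p_o is the raw proportion of
    agreement and p_e the agreement expected by chance from the marginals."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    cats = np.union1d(r1, r2)
    p_o = np.mean(r1 == r2)                       # raw proportion of agreement
    p_e = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)
    return (p_o - p_e) / (1 - p_e)

# --- ICC(2,1): two-way random effects, single rater, absolute agreement -----
def icc_2_1(X):
    """X has shape (n_subjects, k_raters); ICC(2,1) per Shrout & Fleiss,
    computed from the two-way ANOVA mean squares."""
    n, k = X.shape
    grand = X.mean()
    ms_r = k * ((X.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
    ms_c = n * ((X.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # raters
    sse = ((X - X.mean(axis=1, keepdims=True)
              - X.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_e = sse / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# --- Bland-Altman: bias and 95% limits of agreement for two raters ----------
def bland_altman(a, b):
    d = np.asarray(a, float) - np.asarray(b, float)
    bias, sd = d.mean(), d.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two raters grading the same 100 lesions on a 3-point nominal scale,
    # each agreeing with an underlying "truth" about 85% of the time.
    truth = rng.integers(0, 3, 100)
    rater1 = np.where(rng.random(100) < 0.85, truth, rng.integers(0, 3, 100))
    rater2 = np.where(rng.random(100) < 0.85, truth, rng.integers(0, 3, 100))
    print("raw agreement:", np.mean(rater1 == rater2))
    print("Cohen's kappa:", cohen_kappa(rater1, rater2))   # lower than raw

    # Three raters measuring the same 30 lesion diameters (mm).
    size = rng.uniform(5, 40, 30)
    X = size[:, None] + rng.normal(0, 1.5, (30, 3))
    print("ICC(2,1):", icc_2_1(X))

    bias, loa = bland_altman(X[:, 0], X[:, 1])
    print(f"Bland-Altman bias={bias:.2f} mm, "
          f"95% LoA=({loa[0]:.2f}, {loa[1]:.2f})")
```

Note how kappa comes out lower than the raw proportion of agreement: the chance-agreement term p_e discounts the matches that two raters would produce even if they rated at random from their own marginal frequencies, which is exactly why the abstract calls kappa the more meaningful criterion.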

Keywords: Interobserver agreement; Intraclass correlation coefficient; Kappa test; Radiology; Reproducibility of results.
