Arch Gen Psychiatry. 1985 Jul;42(7):725-8. doi: 10.1001/archpsyc.1985.01790300093012.

A proposed solution to the base rate problem in the kappa statistic

E L Spitznagel et al. Arch Gen Psychiatry. 1985 Jul.

Abstract

Because it corrects for chance agreement, kappa (κ) is a useful statistic for calculating interrater concordance. However, kappa has been criticized because its computed value is a function not only of sensitivity and specificity, but also of the prevalence, or base rate, of the illness of interest in the particular population under study. For example, it has been shown that, for a hypothetical case in which sensitivity and specificity remain constant at .95 each, kappa falls from .81 to .14 when the prevalence drops from 50% to 1%. Thus, differing values of kappa may be entirely due to differences in prevalence. Calculation of agreement presents different problems depending on whether one is studying reliability or validity. We discuss quantification of agreement in the pure validity case, the pure reliability case, and those studies that fall somewhere in between. As a way of minimizing the base rate problem, we propose a statistic for the quantification of agreement (the Y statistic), which can be related to kappa but which is completely independent of prevalence in the case of validity studies and relatively so in the case of reliability.
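
The prevalence effect described in the abstract can be reproduced with a short calculation. The Python sketch below is not part of the original article; it assumes two raters who classify subjects independently given the true diagnosis, each with sensitivity and specificity of .95, builds the resulting 2 x 2 agreement table, and computes Cohen's kappa as chance-corrected agreement. The rater model and function names are illustrative assumptions, but the output matches the abstract's figures of .81 at 50% prevalence and .14 at 1%.

# Minimal sketch (assumption: two raters, conditionally independent given the
# true diagnosis, each with sensitivity = specificity = .95), reproducing the
# abstract's hypothetical kappa values at prevalences of 50% and 1%.

def kappa_for(prevalence, sensitivity=0.95, specificity=0.95):
    """Cohen's kappa for two conditionally independent raters."""
    p, se, sp = prevalence, sensitivity, specificity
    # Cell probabilities of the raters' 2 x 2 agreement table.
    both_positive = p * se**2 + (1 - p) * (1 - sp)**2
    both_negative = p * (1 - se)**2 + (1 - p) * sp**2
    observed_agreement = both_positive + both_negative
    # Each rater's marginal probability of a positive rating.
    q = p * se + (1 - p) * (1 - sp)
    chance_agreement = q**2 + (1 - q)**2
    return (observed_agreement - chance_agreement) / (1 - chance_agreement)

for prev in (0.50, 0.01):
    print(f"prevalence {prev:.2f}: kappa = {kappa_for(prev):.2f}")
# prevalence 0.50: kappa = 0.81
# prevalence 0.01: kappa = 0.14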
