BMC Health Serv Res. 2003 Nov 4;3(1):20. doi: 10.1186/1472-6963-3-20.

Inter-rater reliability of nursing home quality indicators in the U.S.

Vincent Mor et al.
Abstract

Background: In the US, Quality Indicators (QIs) profiling and comparing the performance of hospitals, health plans, nursing homes and physicians are routinely published for consumer review. We report the results of the largest study of inter-rater reliability conducted on the nursing home assessments that generate the data used to derive publicly reported nursing home quality indicators.

Methods: We sampled nursing homes in 6 states, selecting up to 30 residents per facility who were observed and assessed by research nurses on 100 clinical assessment elements contained in the Minimum Data Set (MDS); these assessments were compared with the most recent assessment in the record done by facility nurses. Kappa statistics were generated for all data items and derived for 22 QIs, both over the entire sample and for each facility. Finally, facilities with many QIs with poor Kappa levels were compared on selected characteristics to facilities with many QIs with excellent Kappa levels.
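As an illustration (not the authors' analysis code), agreement between a facility nurse's rating and the research ("gold standard") nurse's rating on a single MDS item or dichotomous QI can be summarized with Cohen's Kappa, which corrects observed agreement for agreement expected by chance: kappa = (p_o - p_e) / (1 - p_e). A minimal sketch in Python, assuming two equal-length sequences of categorical ratings:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's Kappa for two raters scoring the same items.

    rater_a, rater_b: equal-length sequences of categorical ratings
    (e.g. a facility nurse's and a research nurse's coding of the same
    MDS item across residents). Illustrative only; a standard package
    routine such as sklearn.metrics.cohen_kappa_score computes the
    same quantity.
    """
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("need two non-empty sequences of equal length")
    n = len(rater_a)

    # Observed proportion of items on which the two raters agree.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Chance agreement: product of the raters' marginal proportions,
    # summed over all rating categories.
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    p_chance = sum(
        (counts_a[c] / n) * (counts_b[c] / n)
        for c in set(counts_a) | set(counts_b)
    )

    if p_chance == 1.0:  # both raters use a single identical category
        return 1.0
    return (p_observed - p_chance) / (1 - p_chance)


# Hypothetical example: 10 residents coded continent (1) / incontinent (0)
facility_nurse = [1, 0, 0, 1, 1, 0, 1, 0, 0, 1]
research_nurse = [1, 0, 1, 1, 1, 0, 1, 0, 0, 0]
print(round(cohens_kappa(facility_nurse, research_nurse), 2))  # 0.6
```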

Results: A total of 462 facilities in 6 states were approached and 219 agreed to participate, yielding a response rate of 47.4%. A total of 5758 residents were included in the inter-rater reliability analyses, an average of roughly 27.5 per facility. Residents resembled the traditional nursing home population: only 43.9% were continent of urine and only 25.2% were rated as likely to be discharged within the next 30 days. Resident-level comparative analyses revealed high inter-rater reliability (most items >.75). Using the research nurses as the "gold standard", we compared composite quality indicators based on their ratings with those based on the facility nurses' ratings. All but two QIs had adequate Kappa levels, and 4 QIs had average Kappa values in excess of .80. We found that 16% of participating facilities performed poorly (Kappa <.4) on more than 6 of the 22 QIs, while 18% of facilities performed well (Kappa >.75) on 12 or more QIs. No facility characteristics were related to the reliability of the data on which QIs are based.
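To make the facility-level classification concrete, the sketch below counts, for each facility, how many of its 22 QI Kappa values fall below .4 ("LO") or reach .75 or better ("HI"), the thresholds used above and in Figure 3. This is an illustration under assumed data structures, not the study's code; the facility names and input layout are hypothetical.

```python
# Hypothetical input: facility id -> list of 22 per-QI Kappa values.
facility_kappas = {
    "facility_A": [0.82, 0.35, 0.77, 0.91, 0.66] + [0.80] * 17,
    "facility_B": [0.30, 0.25, 0.55, 0.38, 0.45] + [0.35] * 17,
}

LOW, HIGH = 0.4, 0.75  # thresholds used in the abstract and Figure 3

def classify(kappas, low=LOW, high=HIGH):
    """Return (number of LO QIs, number of HI QIs) for one facility."""
    n_low = sum(k < low for k in kappas)
    n_high = sum(k >= high for k in kappas)
    return n_low, n_high

for name, kappas in facility_kappas.items():
    n_low, n_high = classify(kappas)
    # "Poor reliability" if more than 6 of 22 QIs have Kappa < .4;
    # "high reliability" if 12 or more QIs have Kappa >= .75.
    label = (
        "poor reliability" if n_low > 6
        else "high reliability" if n_high >= 12
        else "mixed"
    )
    print(f"{name}: LO={n_low}, HI={n_high} -> {label}")
```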

Conclusion: While a few QIs being used for public reporting have limited reliability as measured in US nursing homes today, the vast majority of QIs are measured reliably across the majority of nursing facilities. Although information about the average facility is reliable, how the public can identify which facilities' data can be trusted and which cannot remains a challenge.


Figures

Figure 1. Facility Kappa Values Comparing "Gold Standard" Raters with Facility Nurses: Incontinence Quality Indicator. The distribution of Kappa values averaged for all residents in each facility, reflecting the inter-rater reliability of the "gold standard" nurses and facility nurses on the Incontinence quality indicator. The Y axis indicates the number of facilities and the X axis the facility inter-rater reliability level calculated for the Incontinence QI.
Figure 2. Facility Kappa Values Comparing "Gold Standard" Raters with Facility Nurses: Inadequate Pain Management Quality Indicator. The distribution of Kappa values averaged for all residents in each facility, reflecting the inter-rater reliability of the "gold standard" nurses and facility nurses on the Pain Management quality indicator. The Y axis indicates the number of facilities and the X axis the facility inter-rater reliability level calculated for the Pain Management QI.
Figure 3. Scatter Plot of the Number of HIGH QI Kappa Values and the Number of LOW QI Kappa Values per Facility. For each facility, the number of QIs (out of 22) with Kappa values of .75 or better (HI) is plotted against the number of QIs with Kappa values below .4 (LO). A count of the HI QIs and the LO QIs was generated for each facility and the resulting relationship plotted.

