Comparative Study

Ophthalmology. 2011 Apr;118(4):742-6. doi: 10.1016/j.ophtha.2010.08.019. Epub 2010 Nov 4.

Agreement and accuracy of non-expert ophthalmologists in assessing glaucomatous changes in serial stereo optic disc photographs


Christophe Breusegem et al. Ophthalmology. 2011 Apr.

Abstract

Purpose: To compare the interobserver agreement in detecting glaucomatous optic disc changes using serial stereophotography between a large group of non-expert ophthalmologists and glaucoma specialists; to assess the accuracy of non-experts; to investigate whether the interobserver agreement and the accuracy of non-experts changed after a training session.

Design: Masked interobserver agreement study.

Participants: Serial optic disc stereophotos from 40 patients with glaucoma.

Methods: Three independent experienced glaucoma specialists (readers of the European Glaucoma Prevention Study) evaluated sets of 2 serial optic disc color stereo-slides, obtained 2 to 7 years apart from each of 40 patients, for glaucomatous change, masked to the temporal sequence of the slides. Each patient was graded as changed or stable by agreement of 2 of the 3 experts (the reference standard). Thirty-seven non-expert ophthalmologists independently evaluated the same set of serial optic disc stereo-slides twice on the same day; the second evaluation followed a training session on a separate slide set and was performed masked to the results of the first evaluation.

Main outcome measures: Interobserver agreement of non-experts and experts in detecting glaucomatous optic disc changes (expressed as kappa coefficient); agreement of non-experts with the reference standard (accuracy) before and after a training session.

Results: The interobserver kappa coefficient (κ) of the non-experts and experts was 0.20 (95% confidence interval [CI], 0.19-0.21) and 0.51 (95% CI, 0.33-0.69), respectively (P<0.0001). The mean κ of the non-experts with the reference standard was 0.33 (95% CI, 0.27-0.39). After a training session, the interobserver agreement of the non-experts increased from 0.20 to 0.27 (95% CI, 0.26-0.28) (P<0.0001). The percentage agreement of the non-experts with the reference standard improved from 68.5% before to 71.4% after the training session (Wilcoxon signed-rank test, P=0.034).
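The agreement figures above are Cohen's kappa, κ = (p_o − p_e) / (1 − p_e), which corrects the observed agreement p_o for the agreement p_e expected by chance from each grader's marginal frequencies. A minimal illustrative sketch follows; the gradings are hypothetical and are not data from this study.

```python
# Illustrative sketch of Cohen's kappa for two graders classifying
# serial disc photos as "changed" vs "stable". Hypothetical data only.

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters grading the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Observed agreement: fraction of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal frequencies,
    # summed over categories.
    p_e = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (p_o - p_e) / (1 - p_e)

# Hypothetical gradings of 10 serial stereo-photo pairs.
a = ["changed", "stable", "stable", "changed", "stable",
     "stable", "changed", "stable", "stable", "changed"]
b = ["changed", "stable", "changed", "changed", "stable",
     "stable", "stable", "stable", "stable", "changed"]
print(round(cohens_kappa(a, b), 2))  # → 0.58
```

By the usual Landis-Koch convention, κ near 0.2 (the non-experts' value) indicates slight agreement, while κ near 0.5 (the experts') indicates moderate agreement.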

Conclusions: The interobserver agreement of non-expert ophthalmologists in detecting glaucomatous optic disc changes using serial stereophotos was significantly lower than that of experts, which was moderate. After a training session, the interobserver agreement and the accuracy of the non-experts showed a small but statistically significant improvement.
