Ann Neurol. 2020 Oct;88(4):785-795. doi: 10.1002/ana.25839. Epub 2020 Aug 7.

Optic Disc Classification by Deep Learning versus Expert Neuro-Ophthalmologists


Valérie Biousse et al. Ann Neurol. 2020 Oct.

Abstract

Objective: To compare the diagnostic performance of an artificial intelligence deep learning system with that of expert neuro-ophthalmologists in classifying optic disc appearance.

Methods: The deep learning system was previously trained and validated on 14,341 ocular fundus photographs from 19 international centers. The performance of the system was evaluated on 800 new fundus photographs (400 normal optic discs, 201 papilledema [disc edema from elevated intracranial pressure], 199 other optic disc abnormalities) and compared with that of 2 expert neuro-ophthalmologists who independently reviewed the same randomly presented images without clinical information. Area under the receiver operating characteristic curve, accuracy, sensitivity, and specificity were calculated.

Results: The system correctly classified 678 of 800 (84.7%) photographs, compared with 675 of 800 (84.4%) for Expert 1 and 641 of 800 (80.1%) for Expert 2. The system yielded areas under the receiver operating characteristic curve of 0.97 (95% confidence interval [CI] = 0.96-0.98), 0.96 (95% CI = 0.94-0.97), and 0.89 (95% CI = 0.87-0.92) for the detection of normal discs, papilledema, and other disc abnormalities, respectively. The accuracy, sensitivity, and specificity of the system's classification of optic discs were similar to or better than those of the 2 experts. Intergrader agreement at the eye level was 0.71 (95% CI = 0.67-0.76) between Expert 1 and Expert 2, 0.72 (95% CI = 0.68-0.76) between the system and Expert 1, and 0.65 (95% CI = 0.61-0.70) between the system and Expert 2.
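As a point of reference for these figures, the sketch below shows how the headline accuracies and a chance-corrected intergrader agreement score (Cohen's kappa, the usual statistic behind agreement values like those reported) are computed. This is an illustrative sketch only, not the study's code; the grader counts are taken from the Results above, while the label lists are hypothetical toy data.

```python
# Illustrative sketch (not the study's code). Counts are from the
# Results paragraph; the toy label lists below are hypothetical.

def accuracy(correct, total):
    """Fraction of images classified correctly."""
    return correct / total

# Reported per-grader accuracy on the 800-photograph test set.
print(accuracy(678, 800))  # system:   0.8475
print(accuracy(675, 800))  # Expert 1: 0.84375
print(accuracy(641, 800))  # Expert 2: 0.80125

def cohens_kappa(a, b):
    """Chance-corrected agreement between two graders' label lists."""
    assert len(a) == len(b)
    n = len(a)
    labels = set(a) | set(b)
    # Observed agreement: fraction of eyes given the same label.
    observed = sum(x == y for x, y in zip(a, b)) / n
    # Expected agreement if both graders labeled at random with
    # their own marginal label frequencies.
    expected = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)
    return (observed - expected) / (1 - expected)

# Toy three-class example: normal (N), papilledema (P), other (O).
g1 = ["N", "N", "P", "O", "P", "N", "O", "N"]
g2 = ["N", "P", "P", "O", "P", "N", "N", "N"]
print(cohens_kappa(g1, g2))  # 0.6
```

Kappa of 1 means perfect agreement and 0 means agreement no better than chance, so the reported 0.65-0.72 values indicate substantial (but imperfect) agreement among all three graders.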

Interpretation: The performance of this deep learning system at classifying optic disc abnormalities was at least as good as that of the 2 expert neuro-ophthalmologists. Future prospective studies are needed to validate this system as a diagnostic aid in relevant clinical settings. ANN NEUROL 2020;88:785-795.


