. 2020 Jun;8(2):100430.
doi: 10.1016/j.hjdsi.2020.100430. Epub 2020 May 22.

Methods to identify dementia in the electronic health record: Comparing cognitive test scores with dementia algorithms

Barbara N Harding et al. Healthc (Amst). 2020 Jun.

Abstract

Background: Epidemiologic studies often use diagnosis codes to identify dementia outcomes. It remains unknown to what extent cognitive screening test results add value in identifying dementia cases in big data studies leveraging electronic health record (EHR) data. We examined test scores from EHR data and compared results with dementia algorithms.

Methods: This retrospective cohort study included patients 60+ years of age from Kaiser Permanente Washington (KPWA) during 2013-2018 and the Veterans Health Affairs (VHA) during 2012-2015. Results from the Mini Mental State Examination (MMSE) and the Saint Louis University Mental Status Examination (SLUMS) cognitive screening exams were classified as showing dementia or not. Multiple dementia algorithms were created using combinations of diagnosis codes, pharmacy records, and specialty care visits. Correlations between test scores and algorithms were assessed.
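The best-performing algorithm described below requires ≥2 dementia diagnosis codes within a 12-month window. A minimal sketch of that rule follows; the function name and the representation of a patient's diagnosis dates as a plain list are illustrative assumptions, not the study's actual KPWA/VHA data pipeline:

```python
from datetime import date, timedelta

def meets_two_code_algorithm(dx_dates, window_days=365):
    """Hypothetical check for the '>=2 dementia diagnosis codes in 12 months'
    rule: sort the patient's dementia diagnosis-code dates and test whether
    any two consecutive dates fall within `window_days` of each other."""
    dates = sorted(dx_dates)
    return any(
        (later - earlier) <= timedelta(days=window_days)
        for earlier, later in zip(dates, dates[1:])
    )

# Two codes 90 days apart -> meets the algorithm
print(meets_two_code_algorithm([date(2014, 1, 5), date(2014, 4, 5)]))   # True
# Two codes more than a year apart -> does not
print(meets_two_code_algorithm([date(2013, 1, 5), date(2015, 4, 5)]))   # False
```

Checking only consecutive sorted dates suffices because if any pair of dates lies within the window, the closest such pair is consecutive in sorted order.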

Results: 3,690 of 112,917 KPWA patients and 2,981 of 102,981 VHA patients had cognitive test results in the EHR. In KPWA, dementia prevalence ranged from 6.4% to 8.1% depending on the algorithm used, and in the VHA from 8.9% to 12.1%. The algorithm that best agreed with test scores required ≥2 dementia diagnosis codes in 12 months; at KPWA, 14.8% of people meeting this algorithm had an MMSE score, of whom 65% had a score indicating dementia. Within the VHA, those figures were 6.2% and 77%, respectively.

Conclusions: Although cognitive test results were rarely available, agreement was good with algorithms requiring ≥2 dementia diagnosis codes, supporting the accuracy of this algorithm.

Implications: These scores may add value in identifying dementia cases for EHR-based research studies.

Keywords: Algorithms; Cognitive screening; Dementia; Electronic health record.


Conflict of interest statement

Declaration of competing interest Dr. Floyd has consulted for Shionogi Inc. Other authors have no conflicts of interest to disclose.

Figures

Figure 1:
Nested algorithms in study. This figure shows the three groups of nested, hierarchical algorithms. The criteria for each algorithm are shown along with the number of patients from each population (KPWA or VHA) who met it. Larger circles indicate algorithms with less stringent criteria; moving toward the center, the criteria become more stringent, either by requiring more data or a shorter time window.
Figure 2:
Anchor date and time-window for assessing the correlation between cognitive test results and dementia algorithms. This figure shows the comparison of whether a cognitive test was completed during the time window of interest, given that criteria for a dementia algorithm were met. For this comparison, we determine the anchor date on which an algorithm was met, then look back over all available history and forward 6 months from this anchor date to determine whether a cognitive test was completed.
Figure 3:
Anchor date and time-window for assessing the correlation between cognitive test results and dementia algorithms. This figure shows the comparison of which algorithms were met during the time window of interest, given that a cognitive test was completed. For this comparison, we determine the anchor date on which a cognitive test (MMSE or SLUMS) was completed, then look back over all available history and forward 12 months from this anchor date to determine whether one or more algorithms were met.
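The anchor-date logic in Figures 2 and 3 can be sketched as a single window check: all available look-back (no lower bound) plus a fixed forward horizon from the anchor. The function name and the 30-day month approximation are our assumptions for illustration; the study's exact date arithmetic is not specified in the abstract:

```python
from datetime import date, timedelta

def event_within_window(anchor, event_dates, forward_months):
    """Hypothetical window check for the Figure 2/3 comparisons: given an
    anchor date (the date an algorithm was met, or a test was completed),
    return True if any event falls on or before anchor + forward_months,
    with unlimited look-back (months approximated as 30 days)."""
    horizon = anchor + timedelta(days=30 * forward_months)
    return any(event <= horizon for event in event_dates)

anchor = date(2015, 1, 1)
# A test 4 months after the anchor is inside the 6-month forward window
print(event_within_window(anchor, [date(2015, 5, 1)], forward_months=6))  # True
# A test years before the anchor still counts (all available look-back)
print(event_within_window(anchor, [date(2010, 3, 1)], forward_months=6))  # True
# A test 7 months after the anchor falls outside the 6-month window
print(event_within_window(anchor, [date(2015, 8, 1)], forward_months=6))  # False
```

The same function covers both comparisons: Figure 2 uses an algorithm-met anchor with `forward_months=6` against test dates; Figure 3 uses a test-completion anchor with `forward_months=12` against algorithm-met dates.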
Figure 4:
Among people with a cognitive test score indicating dementia, proportion meeting algorithms ever before or within 12 months following the test. This figure shows how many patients from each population met each algorithm among those whose cognitive screening test score indicated dementia.
