Digit Biomark. 2023 Aug 31;7(1):115-123. doi: 10.1159/000533188. eCollection 2023 Jan-Dec.

Validation of an Automated Speech Analysis of Cognitive Tasks within a Semiautomated Phone Assessment

Daphne Ter Huurne et al. Digit Biomark.

Abstract

Introduction: We studied the accuracy of automatic speech recognition (ASR) software by comparing ASR scores with manual scores from a verbal learning test (VLT) and a semantic verbal fluency (SVF) task in a semiautomated phone assessment in a memory clinic population. Furthermore, we examined the value of these tests for differentiating between participants with subjective cognitive decline (SCD) and mild cognitive impairment (MCI). We also investigated whether automatically calculated speech and linguistic features add value beyond the commonly used total scores in a semiautomated phone assessment.

Methods: We included 94 participants from the memory clinic of the Maastricht University Medical Center+ (SCD N = 56 and MCI N = 38). The test leader guided each participant through a semiautomated phone assessment. The VLT and SVF were audio recorded and processed via a mobile application, and the recall count and speech and linguistic features were automatically extracted. Machine learning classifiers were trained to differentiate between SCD and MCI participants.
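The classifier performance reported below is summarized by the area under the receiver operating characteristic curve (AUC). As an illustration only, and not the authors' pipeline, AUC can be computed directly from per-participant classifier scores via its rank-based (Mann-Whitney) definition; the scores here are made up:

```python
def auc_from_scores(pos_scores, neg_scores):
    """Area under the ROC curve via the Mann-Whitney statistic:
    the probability that a randomly chosen positive case is scored
    higher than a randomly chosen negative one (ties count 0.5)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical classifier scores (higher = more likely MCI);
# these values are invented for illustration, not study data.
mci_scores = [0.9, 0.8, 0.4]   # "positives" (MCI)
scd_scores = [0.5, 0.3, 0.2]   # "negatives" (SCD)
print(auc_from_scores(mci_scores, scd_scores))  # 8 of 9 pairs correctly ordered ≈ 0.89
```

An AUC of 0.5 corresponds to chance-level separation of the two groups, and 1.0 to perfect separation, which is the scale on which the Results below should be read.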

Results: The intraclass correlation for inter-rater reliability between the manual and the ASR total word count was 0.89 (95% CI 0.09-0.97) for the VLT immediate recall, 0.94 (95% CI 0.68-0.98) for the VLT delayed recall, and 0.93 (95% CI 0.56-0.97) for the SVF. The full model including the total word count and speech and linguistic features had an area under the curve of 0.81 and 0.77 for the VLT immediate and delayed recall, respectively, and 0.61 for the SVF.
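The inter-rater reliability statistic above treats the manual scorer and the ASR system as two "raters" scoring the same recordings. A minimal sketch of a two-way random-effects, absolute-agreement, single-rater ICC(2,1), computed from the ANOVA mean squares on hypothetical word counts (not study data):

```python
import numpy as np

def icc2_1(ratings: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    ratings: (n_subjects, k_raters) matrix of scores.
    """
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-rater means
    # Mean squares from the two-way ANOVA decomposition
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between subjects
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between raters
    sse = np.sum((ratings - row_means[:, None] - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))                        # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical word counts: column 0 = manual scorer, column 1 = ASR
counts = np.array([[12, 11], [9, 9], [15, 14], [7, 8], [11, 11], [13, 12]])
print(round(icc2_1(counts), 2))  # close agreement → ICC near 1
```

With only a handful of subjects the confidence interval around such an estimate is wide, which is why the broad intervals in the Results (e.g., 0.09-0.97) matter when interpreting the point estimates.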

Conclusion: Agreement between the ASR and manual scores was high, although the broad confidence intervals should be kept in mind. The phone-based VLT was able to differentiate between SCD and MCI and may offer opportunities for clinical trial screening.

Keywords: Automated speech analysis; Fluency; Memory; Mild cognitive impairment; Phone assessment.


Conflict of interest statement

J.T., A.K., and N.L. are employed by the company ki elements, which developed the mobile application and calculated the speech and linguistic features. N.L., A.K., and J.T. own shares in the company.

Figures

Fig. 1. Scatterplot of the manually and automatically derived (automatic speech recognition [ASR]) scores for the immediate (a) and delayed (b) verbal learning task (VLT) total word count in the semiautomated phone assessment.

Fig. 2. Scatterplot of the manually and automatically derived (automatic speech recognition [ASR]) semantic verbal fluency (SVF) task total word count for the semiautomated phone assessment.

Fig. 3. Receiver operating characteristic curve for the verbal learning task (VLT) immediate recall differentiating between subjective cognitive decline (SCD) and mild cognitive impairment (MCI).

Fig. 4. Receiver operating characteristic curve for the verbal learning task (VLT) delayed recall differentiating between subjective cognitive decline (SCD) and mild cognitive impairment (MCI).

Fig. 5. Receiver operating characteristic curve for the semantic verbal fluency (SVF) task differentiating between subjective cognitive decline (SCD) and mild cognitive impairment (MCI).
