Investigating Methods for Cognitive Workload Estimation for Assistive Robots
- PMID: 36146189
- PMCID: PMC9505485
- DOI: 10.3390/s22186834
Abstract
Robots interacting with humans in assistive contexts must be sensitive to human cognitive states so that they can provide help when it is needed without overburdening the human when the human is busy. Yet, it is currently still unclear which sensing modality might allow robots to derive the best evidence of human workload. In this work, we analyzed and modeled data from a multi-modal simulated driving study specifically designed to evaluate different levels of cognitive workload induced by various secondary tasks, such as dialogue interactions and braking events, in addition to the primary driving task. Specifically, we performed statistical analyses of various physiological signals, including eye gaze, electroencephalography, and arterial blood pressure, from healthy volunteers and utilized several machine learning methodologies, including k-nearest neighbor, naive Bayes, random forest, support-vector machines, and neural-network-based models, to infer human cognitive workload levels. Our analyses provide evidence for eye gaze being the best physiological indicator of human cognitive workload, even when multiple signals are combined. Specifically, the highest binary workload classification accuracy based on eye gaze signals is 80.45 ± 3.15%, achieved using support-vector machines, while the highest accuracy combining eye gaze and electroencephalography is only 77.08 ± 3.22%, achieved by a neural-network-based model. Our findings are important for future efforts at real-time workload estimation in multimodal human-robot interactive systems, given that eye gaze is easy to collect and process and is less susceptible to noise artifacts than other physiological signal modalities.
Keywords: EEG; assistive robots; autonomous interactive systems; cognitive workload classification; eye gaze; multi-modality learning; pupillometry.
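To make the classification setup concrete, the sketch below illustrates the kind of binary workload classifier the abstract describes: a support-vector machine trained on gaze-derived features and evaluated with cross-validation, reporting mean ± standard deviation accuracy. This is a minimal illustration under assumed inputs, not the authors' pipeline; the feature set, window counts, and randomly generated data are placeholders.

```python
# Minimal sketch (not the authors' code): binary workload classification from
# eye-gaze features with a support-vector machine, the model family reported
# as most accurate on gaze signals. Features and labels here are synthetic.
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder feature matrix: one row per time window, columns for assumed
# gaze-derived features (e.g., mean pupil diameter, fixation duration,
# saccade rate, blink rate); labels are low (0) vs. high (1) workload.
X = rng.normal(size=(400, 4))
y = rng.integers(0, 2, size=400)

# Standardize features, fit an RBF-kernel SVM, and estimate accuracy with
# stratified 5-fold cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(
    clf, X, y, cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
)
print(f"accuracy: {scores.mean():.2%} +/- {scores.std():.2%}")
```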
Conflict of interest statement
The authors declare no conflict of interest.
Similar articles
- Measurement and identification of mental workload during simulated computer tasks with multimodal methods and machine learning. Ergonomics. 2020 Jul;63(7):896-908. doi: 10.1080/00140139.2020.1759699. Epub 2020 May 7. PMID: 32330080
- Prediction of cognitive conflict during unexpected robot behavior under different mental workload conditions in a physical human-robot collaboration. J Neural Eng. 2024 Mar 19;21(2). doi: 10.1088/1741-2552/ad2494. PMID: 38295415
- Eye-Tracking in Physical Human-Robot Interaction: Mental Workload and Performance Prediction. Hum Factors. 2024 Aug;66(8):2104-2119. doi: 10.1177/00187208231204704. Epub 2023 Oct 4. PMID: 37793896
- Multimodal Fusion for Objective Assessment of Cognitive Workload: A Review. IEEE Trans Cybern. 2021 Mar;51(3):1542-1555. doi: 10.1109/TCYB.2019.2939399. Epub 2021 Feb 17. PMID: 31545761. Review.
- Review of Eye Tracking Metrics Involved in Emotional and Cognitive Processes. IEEE Rev Biomed Eng. 2023;16:260-277. doi: 10.1109/RBME.2021.3066072. Epub 2023 Jan 5. PMID: 33729950. Review.
Cited by
- Assistive Robots for Healthcare and Human-Robot Interaction. Sensors (Basel). 2023 Feb 8;23(4):1883. doi: 10.3390/s23041883. PMID: 36850481. Free PMC article.
- Human-Centric Cognitive State Recognition Using Physiological Signals: A Systematic Review of Machine Learning Strategies Across Application Domains. Sensors (Basel). 2025 Jul 5;25(13):4207. doi: 10.3390/s25134207. PMID: 40648460. Free PMC article.
- Quantifying Cognitive Workload Using a Non-Contact Magnetocardiography (MCG) Wearable Sensor. Sensors (Basel). 2022 Nov 24;22(23):9115. doi: 10.3390/s22239115. PMID: 36501816. Free PMC article.