Eye-Tracking in Physical Human-Robot Interaction: Mental Workload and Performance Prediction
- PMID: 37793896
- DOI: 10.1177/00187208231204704
Abstract
Background: In Physical Human-Robot Interaction (pHRI), the need to learn the robot's motor-control dynamics is associated with increased cognitive load. Eye-tracking metrics can help understand the dynamics of fluctuating mental workload over the course of learning.
Objective: The aim of this study was to test the sensitivity and reliability of eye-tracking measures with respect to variations in task difficulty, as well as their ability to predict performance, in physical human-robot collaboration tasks involving an industrial robot for object comanipulation.
Methods: Participants (9M, 9F) learned to coperform a virtual pick-and-place task with a bimanual robot over multiple trials. Joint stiffness of the robot was manipulated to increase motor-coordination demands. The psychometric properties of eye-tracking measures and their ability to predict performance were investigated.
Results: Stationary Gaze Entropy and pupil diameter were the most reliable and sensitive measures of workload associated with changes in task difficulty and learning. Increased task difficulty was more likely to result in a robot-monitoring strategy. Eye-tracking measures were able to predict the occurrence of success or failure in each trial with 70% sensitivity and 71% accuracy.
Conclusion: The sensitivity and reliability of eye-tracking measures were acceptable, although values were lower than those observed in cognitive domains. Measures of gaze behaviors indicative of visual monitoring strategies were most sensitive to task-difficulty manipulations and should be explored further for the pHRI domain, where motor control and internal-model formation are likely to be strong contributors to workload.
Application: Future collaborative robots could adapt to human cognitive state and skill level as measured by eye-tracking indices of workload and visual attention.
Keywords: motor learning; psychometrics; reliability; strategies; virtual environments.
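Stationary Gaze Entropy, the headline measure in the Results, is commonly computed as the Shannon entropy of the distribution of fixations across areas of interest (AOIs). The sketch below is a minimal, hypothetical illustration of that computation; the AOI labels and fixation sequence are invented for the example and are not taken from the study.

```python
from collections import Counter
from math import log2

def stationary_gaze_entropy(fixation_aois):
    """Shannon entropy (in bits) of the stationary distribution of
    fixations across AOIs. Higher values indicate more dispersed
    visual scanning; lower values suggest focused monitoring of a
    single AOI (e.g. a robot-monitoring strategy)."""
    counts = Counter(fixation_aois)
    total = sum(counts.values())
    probs = [c / total for c in counts.values()]
    return -sum(p * log2(p) for p in probs)

# Hypothetical fixation sequence over three AOIs
seq = ["robot", "target", "robot", "robot", "hand", "target", "robot"]
print(round(stationary_gaze_entropy(seq), 3))  # ~1.379 bits
```

A sequence concentrated on one AOI yields an entropy near zero, which is why a shift toward robot monitoring under higher joint stiffness would register as reduced gaze entropy.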
Similar articles
- An Integrated Electroencephalography and Eye-Tracking Analysis Using eXtreme Gradient Boosting for Mental Workload Evaluation in Surgery. Hum Factors. 2025;67(5):464-484. doi: 10.1177/00187208241285513. PMID: 39325959
- Eye-Tracking Metrics Predict Perceived Workload in Robotic Surgical Skills Training. Hum Factors. 2020;62(8):1365-1386. doi: 10.1177/0018720819874544. PMID: 31560573. Free PMC article.
- Prediction of cognitive conflict during unexpected robot behavior under different mental workload conditions in a physical human-robot collaboration. J Neural Eng. 2024;21(2). doi: 10.1088/1741-2552/ad2494. PMID: 38295415
- Using Eye Tracking for Measuring Cognitive Workload During Clinical Simulations: Literature Review and Synthesis. Comput Inform Nurs. 2021;39(9):499-507. doi: 10.1097/CIN.0000000000000704. PMID: 34495011. Review.
- Review of Eye Tracking Metrics Involved in Emotional and Cognitive Processes. IEEE Rev Biomed Eng. 2023;16:260-277. doi: 10.1109/RBME.2021.3066072. PMID: 33729950. Review.
Cited by
- Situational Awareness Prediction for Remote Tower Controllers Based on Eye-Tracking and Heart Rate Variability Data. Sensors (Basel). 2025;25(7):2052. doi: 10.3390/s25072052. PMID: 40218565. Free PMC article.
- Characterizing eye gaze and mental workload for assistive device control. Wearable Technol. 2025;6:e13. doi: 10.1017/wtc.2024.27. PMID: 40071242. Free PMC article.