Predicting choice behaviour in economic games using gaze data encoded as scanpath images
- PMID: 36959330
- PMCID: PMC10036613
- DOI: 10.1038/s41598-023-31536-5
Abstract
Eye movement data have been extensively utilized by researchers studying decision-making in the strategic setting of economic games. In this paper, we demonstrate that both deep learning and support vector machine classifiers can accurately identify participants' decision strategies before they commit to an action while playing games. Our approach focuses on creating scanpath images that capture the dynamics of a participant's gaze behaviour in a form that is meaningful to the machine learning models. Our results show a classification accuracy 18 percentage points higher than that of a baseline logistic regression model, which is traditionally used to analyse gaze data recorded during economic games. In a broader context, we aim to illustrate the potential for eye-tracking data to create information asymmetries in strategic environments in favour of those who collect and process the data. These information asymmetries could become especially relevant as eye tracking becomes more widespread in user applications, with the seemingly imminent mass adoption of virtual reality systems and the development of devices able to record eye movements outside of a laboratory setting.
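The core idea of encoding gaze behaviour as scanpath images can be sketched as follows. This is a hypothetical, minimal rendering scheme for illustration only: it assumes fixations arrive as normalised (x, y, duration) triples and weights pixel intensity by fixation duration, whereas the paper's actual image-encoding pipeline may differ.

```python
import numpy as np

def scanpath_to_image(fixations, size=64):
    """Render (x, y, duration) fixations into a grayscale scanpath image.

    Coordinates are assumed normalised to [0, 1]; duration (seconds)
    weights the pixel intensity so longer fixations appear brighter.
    This encoding is an illustrative assumption, not the paper's exact method.
    """
    img = np.zeros((size, size), dtype=np.float32)
    for x, y, dur in fixations:
        col = min(int(x * (size - 1)), size - 1)
        row = min(int(y * (size - 1)), size - 1)
        img[row, col] += dur  # accumulate dwell time at this location
    # Normalise to [0, 1] so images are comparable across trials
    if img.max() > 0:
        img /= img.max()
    return img

# Example: three fixations, e.g. over cells of a payoff matrix
fixations = [(0.2, 0.2, 0.30), (0.8, 0.2, 0.15), (0.2, 0.8, 0.45)]
img = scanpath_to_image(fixations)
```

An image like this can then be fed directly to a convolutional network, or flattened into a feature vector for a support vector machine, so that spatial and dwell-time structure in the gaze record is available to the classifier.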
© 2023. The Author(s).
Conflict of interest statement
The authors declare no competing interests.