Eye-Tracking Analysis for Emotion Recognition

Paweł Tarnowski et al. Comput Intell Neurosci. 2020 Aug 27;2020:2909267. doi: 10.1155/2020/2909267. eCollection 2020.

Abstract

This article reports the results of a study on emotion recognition using eye tracking. Emotions were evoked by presenting dynamic movie material in the form of 21 video fragments. Eye-tracking signals recorded from 30 participants were used to calculate 18 features associated with eye movements (fixations and saccades) and pupil diameter. To ensure that the features were related to emotions, we investigated the influence of the luminance and the dynamics of the presented movies. Three classes of emotions were considered: high arousal and low valence, low arousal and moderate valence, and high arousal and high valence. A maximum classification accuracy of 80% was obtained using a support vector machine (SVM) classifier with leave-one-subject-out validation.
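The classification setup can be illustrated with a short sketch: an SVM evaluated with leave-one-subject-out cross-validation, so each fold tests on a participant unseen during training. The feature matrix, labels, and subject assignments below are synthetic placeholders (the paper's 18 eye-movement and pupil features are not reproduced here), and scikit-learn is assumed.

# Sketch of SVM classification with leave-one-subject-out validation.
# All data below are random placeholders, not the study's features.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

n_subjects, n_movies, n_features = 30, 21, 18
X = rng.normal(size=(n_subjects * n_movies, n_features))   # placeholder features
y = rng.integers(0, 3, size=n_subjects * n_movies)         # 3 emotion classes
groups = np.repeat(np.arange(n_subjects), n_movies)        # subject ID per sample

# Each fold trains on 29 subjects and tests on the held-out one,
# so accuracy reflects generalization to unseen participants.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"mean accuracy: {scores.mean():.2f}")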

Conflict of interest statement

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Figures

Figure 1. Distribution of emotions evoked by the movies (numbered 1 to 21) on the valence-arousal plane, along with the created classes (marked in red).
Figure 2. Typical sequence of the presented movies.
Figure 3. Preprocessing of the eye-tracking signal: (a) raw signal, (b) signal after interpolation, and (c) signal after low-pass filtering (a minimal sketch of this step follows the figure list).
Figure 4. A fragment of an eye-tracking signal (fixations, saccades, and gaze coordinates).
Figure 5. Distribution of the average pupil diameter and the average saccade amplitude for the three classes of emotions.
Figure 6. Removing the effect of movie luminance: (a) calculated luminance of a movie; (b) blue line, registered pupil diameter for a participant; red line, estimated pupil diameter (as a response to movie luminance); (c) pupil diameter after removing the luminance effect (a second sketch below illustrates this step).
Figure 7. Two sample frames from the movies with one participant's gaze positions.
Figure 8. Changes in the pupil diameter of one participant after removing the influence of luminance, calculated using two methods.
Figure 9. Movie dynamics index.
Figure 10. Dependence of the arousal parameter on the movie dynamics.
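Below is a minimal sketch of the preprocessing shown in Figure 3, assuming linear interpolation over blink dropouts and a zero-phase Butterworth low-pass filter; the 250 Hz sampling rate and 5 Hz cutoff are illustrative assumptions, not values taken from the paper.

# Sketch of the Figure 3 pipeline: fill gaps, then low-pass filter.
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess(signal, fs=250.0, cutoff=5.0):
    """Interpolate NaN dropouts, then apply a zero-phase low-pass filter."""
    x = np.asarray(signal, dtype=float)
    t = np.arange(x.size)
    valid = ~np.isnan(x)
    x = np.interp(t, t[valid], x[valid])      # fill gaps linearly
    b, a = butter(4, cutoff / (fs / 2.0))     # 4th-order Butterworth low-pass
    return filtfilt(b, a, x)                  # filter without phase lag

# Example: a 4-second pupil trace with a simulated blink gap
raw = np.sin(np.linspace(0, 8 * np.pi, 1000)) + 4.0
raw[300:340] = np.nan
clean = preprocess(raw)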
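Figure 6's luminance correction can be sketched the same way: estimate the pupil-diameter component explained by movie luminance and subtract it, keeping the residual as the emotion-related signal. A plain least-squares fit stands in here for the paper's estimation method, which this excerpt does not specify.

# Sketch of luminance removal: regress pupil diameter on luminance
# and keep the residual (re-centered on the original mean).
import numpy as np

def remove_luminance_effect(pupil, luminance):
    """Return pupil diameter with the luminance-driven component removed."""
    A = np.column_stack([luminance, np.ones_like(luminance)])
    coef, *_ = np.linalg.lstsq(A, pupil, rcond=None)  # fit pupil ~ luminance
    estimated = A @ coef                              # cf. Fig. 6(b), red line
    return pupil - estimated + pupil.mean()           # cf. Fig. 6(c)

# Example with synthetic data: pupil constricts as luminance rises
lum = np.random.default_rng(1).uniform(0.0, 1.0, 500)
pupil = 5.0 - 1.5 * lum + 0.1 * np.random.default_rng(2).normal(size=500)
corrected = remove_luminance_effect(pupil, lum)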
