Idiosyncratic fixation patterns generalize across dynamic and static facial expression recognition
- PMID: 39003314
- PMCID: PMC11246522
- DOI: 10.1038/s41598-024-66619-4
Abstract
Facial expression recognition (FER) is crucial for understanding the emotional state of others during human social interactions. It has been assumed that humans share universal visual sampling strategies to achieve this task. However, recent studies in face identification have revealed striking idiosyncratic fixation patterns, questioning the universality of face processing. More importantly, very little is known about whether such idiosyncrasies extend to the biologically relevant recognition of static and dynamic facial expressions of emotion (FEEs). To clarify this issue, we tracked observers' eye movements while they categorized static and ecologically valid dynamic faces displaying the six basic FEEs, all normalized for presentation time (1 s), contrast, and global luminance across exposure time. We then used robust data-driven analyses combining statistical fixation maps with hidden Markov models to explore eye movements across FEEs and stimulus modalities. Our data revealed three spatially and temporally distinct, equally occurring face-scanning strategies during FER. Crucially, these visual sampling strategies were mostly comparably effective in FER and highly consistent across FEEs and modalities. Our findings show that spatiotemporal idiosyncratic gaze strategies also occur for the biologically relevant recognition of FEEs, further questioning the universality of FER and, more generally, face processing.
Keywords: individual differences – facial expressions of emotion – eye movements.
© 2024. The Author(s).
Conflict of interest statement
The authors declare no competing interests.