Can deepfakes be used to study emotion perception? A comparison of dynamic face stimuli
- PMID: 38834812
- PMCID: PMC11362322
- DOI: 10.3758/s13428-024-02443-y
Abstract
Video recordings accurately capture facial expression movements; however, they are difficult for face perception researchers to standardise and manipulate. For this reason, dynamic morphs of photographs are often used, despite their lack of naturalistic facial motion. This study aimed to investigate how humans perceive emotions from faces using real videos and two different approaches to artificially generating dynamic expressions: dynamic morphs and AI-synthesised deepfakes. Our participants perceived dynamic morphed expressions as less intense when compared with videos (all emotions) and deepfakes (fearful, happy, sad). Videos and deepfakes were perceived similarly. Participants also perceived morphed happiness and sadness, but not morphed anger or fear, as less genuine than other formats. Our findings support previous research indicating that social responses to morphed emotions are not representative of those to video recordings. The findings also suggest that deepfakes may offer a more suitable standardised stimulus type than morphs. Additionally, qualitative data were collected from participants and analysed using ChatGPT, a large language model. ChatGPT successfully identified themes in the data consistent with those identified by an independent human researcher. According to this analysis, our participants perceived dynamic morphs as less natural compared with videos and deepfakes. That participants perceived deepfakes and videos similarly suggests that deepfakes effectively replicate natural facial movements, making them a promising alternative for face perception research. The study contributes to the growing body of research exploring the usefulness of generative artificial intelligence for advancing the study of human perception.
Keywords: Deepfake; Dynamic faces; Emotion; Face perception; Generative AI.
© 2024. The Author(s).
Conflict of interest statement
We have no relevant financial or non-financial conflicts of interest to disclose.
Similar articles
- EEG correlates of static and dynamic face perception: The role of naturalistic motion. Neuropsychologia. 2024 Dec 15;205:108986. doi: 10.1016/j.neuropsychologia.2024.108986. PMID: 39218391
- The role of temporal inversion in the perception of realistic and morphed dynamic transitions between facial expressions. Vision Res. 2018 Feb;143:42-51. doi: 10.1016/j.visres.2017.10.007. PMID: 29274357
- Do Deepfakes Adequately Display Emotions? A Study on Deepfake Facial Emotion Expression. Comput Intell Neurosci. 2022 Oct 18;2022:1332122. doi: 10.1155/2022/1332122. PMID: 36304741
- Emotional Expressions Reconsidered: Challenges to Inferring Emotion From Human Facial Movements. Psychol Sci Public Interest. 2019 Jul;20(1):1-68. doi: 10.1177/1529100619832930. PMID: 31313636
- Can deepfakes manipulate us? Assessing the evidence via a critical scoping review. PLoS One. 2025 May 2;20(5):e0320124. doi: 10.1371/journal.pone.0320124. PMID: 40315197