Sex differences in perception of emotion intensity in dynamic and static facial expressions
- PMID: 16628369
- DOI: 10.1007/s00221-005-0254-0
Abstract
Most research on the perception of emotional expressions has been conducted using static faces as stimuli. Facial displays of emotion, however, are highly dynamic, and a static photograph is an unnatural representation of them. The goal of the present research was to assess the role of stimulus dynamics, as well as subjects' sex, in the perception of emotional expressions. In the experiment, subjects rated the intensity of expressions of anger and happiness presented as photographs (static stimuli) and animations (dynamic stimuli). Both stimulus dynamics and emotion type affected the perceived intensity: 'angry faces' were judged as more intense than 'happy faces', and intensity ratings were higher for animations than for photographs. Moreover, sex differences in the rated intensity were found. Male subjects gave higher intensity ratings for dynamic than for static expressions of anger but showed no such difference for happiness, whereas for female subjects the difference was significant for both anger and happiness. The results suggest that the dynamic character of a facial display is an important factor in the perception of the intensity of emotional expressions, but that its effect depends on the subjects' sex and the emotion's valence.
Similar articles
- Do Dynamic Compared to Static Facial Expressions of Happiness and Anger Reveal Enhanced Facial Mimicry? PLoS One. 2016 Jul 8;11(7):e0158534. doi: 10.1371/journal.pone.0158534. PMID: 27390867. Free PMC article. Clinical Trial.
- Emotion perception from dynamic and static body expressions in point-light and full-light displays. Perception. 2004;33(6):717-46. doi: 10.1068/p5096. PMID: 15330366.
- Effects of affective and emotional congruency on facial expression processing under different task demands. Acta Psychol (Amst). 2018 Jun;187:66-76. doi: 10.1016/j.actpsy.2018.04.013. PMID: 29751931.
- Human and machine recognition of dynamic and static facial expressions: prototypicality, ambiguity, and complexity. Front Psychol. 2023 Sep 12;14:1221081. doi: 10.3389/fpsyg.2023.1221081. PMID: 37794914. Free PMC article. Review.
- Representations of facial expressions since Darwin. Evol Hum Sci. 2022 Apr 28;4:e22. doi: 10.1017/ehs.2022.10. PMID: 37588914. Free PMC article. Review.
Cited by
- The perception of dynamic and static facial expressions of happiness and disgust investigated by ERPs and fMRI constrained source analysis. PLoS One. 2013 Jun 20;8(6):e66997. doi: 10.1371/journal.pone.0066997. PMID: 23818974. Free PMC article. Clinical Trial.
- Children Facial Expression Production: Influence of Age, Gender, Emotion Subtype, Elicitation Condition and Culture. Front Psychol. 2018 Apr 4;9:446. doi: 10.3389/fpsyg.2018.00446. PMID: 29670561. Free PMC article.
- Selecting pure-emotion materials from the International Affective Picture System (IAPS) by Chinese university students: A study based on intensity-ratings only. Heliyon. 2017 Aug 30;3(8):e00389. doi: 10.1016/j.heliyon.2017.e00389. PMID: 28920091. Free PMC article.
- The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS): A dynamic, multimodal set of facial and vocal expressions in North American English. PLoS One. 2018 May 16;13(5):e0196391. doi: 10.1371/journal.pone.0196391. PMID: 29768426. Free PMC article.
- Relationships among facial mimicry, emotional experience, and emotion recognition. PLoS One. 2013;8(3):e57889. doi: 10.1371/journal.pone.0057889. PMID: 23536774. Free PMC article.