Interactions between faces and visual context in emotion perception: A meta-analysis
- PMID: 40180758
- PMCID: PMC12426097
- DOI: 10.3758/s13423-025-02678-6
Abstract
Long-standing theories of emotion perception, such as basic emotion theory, argue that we perceive others' emotions primarily through their facial expressions. However, compelling evidence shows that other visual contexts, such as body posture or scenes, significantly influence the emotions perceived from faces, and vice versa. We used meta-analysis to synthesise and quantify these effects for the first time, testing whether faces have primacy over context after accounting for key moderators: the emotional congruency and clarity of the stimuli. A total of 1,020 effect sizes from 37 articles and 3,198 participants were meta-analysed using three-level mixed-effects models with robust variance estimation. Both visual context and faces were found to have large effects on emotion labelling for the other (gav > 1.23). Effects were larger when visual context and faces signalled different (incongruent) rather than the same (congruent) emotions, and congruent effects were moderated by how clearly the stimuli signalled the target emotion. When these factors were accounted for, faces were no more influential in altering emotion labelling than body postures, or body postures with scenes. The findings of this review clearly evidence the integrative nature of emotion perception. Importantly, however, they also highlight that the influence of different emotion signals depends on how clearly each signals an emotion. Future research needs to account for emotional congruency and signal clarity.
Keywords: Affective integration; Facial primacy; Nonverbal communication.
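The abstract reports that effect sizes (gav, i.e., Hedges' g standardised by the average condition SD) were pooled with three-level mixed-effects models and robust variance estimation, which are typically fitted in dedicated meta-analysis software. As a simplified, hedged illustration of the underlying pooling logic only, the Python sketch below computes gav for a within-subjects contrast and pools effects with a basic two-level DerSimonian-Laird random-effects model; all numeric inputs are hypothetical, and this is not the authors' actual three-level analysis.

```python
# Simplified two-level random-effects pooling (DerSimonian-Laird).
# NOTE: a minimal sketch, not the paper's three-level mixed-effects
# model with robust variance estimation; all inputs are hypothetical.
import numpy as np

def hedges_g_av(m1, m2, sd1, sd2, r, n):
    """Hedges' g_av for a within-subjects contrast, standardised by
    the average of the two condition SDs (Borenstein-style variance)."""
    sd_av = np.sqrt((sd1**2 + sd2**2) / 2.0)      # average-SD standardiser
    d = (m1 - m2) / sd_av                          # Cohen's d_av
    j = 1.0 - 3.0 / (4.0 * (n - 1) - 1.0)          # small-sample correction
    var_d = (1.0 / n + d**2 / (2.0 * n)) * 2.0 * (1.0 - r)  # paired design
    return j * d, (j**2) * var_d

def dersimonian_laird(g, v):
    """Pool effect sizes g with sampling variances v (random effects)."""
    g, v = np.asarray(g, float), np.asarray(v, float)
    w = 1.0 / v                                    # fixed-effect weights
    g_fixed = np.sum(w * g) / np.sum(w)
    q = np.sum(w * (g - g_fixed)**2)               # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(g) - 1)) / c)        # between-study variance
    w_star = 1.0 / (v + tau2)                      # random-effects weights
    pooled = np.sum(w_star * g) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# Hypothetical per-study means/SDs/correlations/sample sizes
g, v = zip(*[hedges_g_av(4.1, 2.6, 1.1, 1.2, 0.5, 30),
             hedges_g_av(3.8, 2.4, 1.0, 1.3, 0.4, 45),
             hedges_g_av(4.5, 2.9, 1.4, 1.2, 0.6, 28)])
pooled, se, tau2 = dersimonian_laird(g, v)
print(f"pooled g = {pooled:.2f} (SE {se:.2f}), tau^2 = {tau2:.3f}")
```

A three-level model additionally partitions heterogeneity into within-article and between-article variance (because many effect sizes come from the same article), and robust variance estimation guards against misspecified dependence among those clustered effects; the two-level sketch above omits both refinements.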
© 2025. The Author(s).
Conflict of interest statement
Declarations. Ethics approval: Not applicable. Consent to participate: Not applicable. Consent for publication: Not applicable. Conflicts of interest/Competing interests: The authors have no conflicts of interest or competing interests to declare. Open practices statement: The protocol for this systematic review and meta-analysis was preregistered on 2 May 2022. The protocol and datasets generated and analysed during the current study are available in the OSF repository at https://osf.io/mn7qk/ . Author note: This research is supported by the Australian Government through the Australian Research Council's Discovery Projects funding scheme (project DP220101026) to AD and RP and by a TRANSFORM Career Development Fellowship to AD from The Australian National University (ANU) College of Health and Medicine. The funders had no role in developing or conducting this research. We thank Associate Professor Eryn Newman for feedback on earlier drafts of this project, including her suggestion to consider semantic network theory. We thank Jessica Ramamurthy, Leslie Andrews, Maika Kumada, Mila Knezovic, and Wangtianxi Li for help with article coding.