Dynamic Facial Expressions Prime the Processing of Emotional Prosody
- PMID: 29946247
- PMCID: PMC6007283
- DOI: 10.3389/fnhum.2018.00244
Abstract
Evidence suggests that emotion is represented supramodally in the human brain. Emotional facial expressions, which often precede vocally expressed emotion in real life, can modulate event-related potentials (N100 and P200) during emotional prosody processing. To investigate these cross-modal emotional interactions, two lines of research have been pursued: cross-modal integration and cross-modal priming. In cross-modal integration studies, visual and auditory channels are temporally aligned, whereas in priming studies they are presented consecutively. Here we used cross-modal emotional priming to study the interaction of dynamic visual and auditory emotional information. Specifically, we presented dynamic facial expressions (angry, happy, neutral) as primes and emotionally intoned pseudo-speech sentences (angry, happy) as targets. We were interested in how prime-target congruency would affect early auditory event-related potentials, i.e., the N100 and P200, in order to shed more light on how dynamic facial information is used in cross-modal emotional prediction. Results showed enhanced N100 amplitudes for incongruently primed compared to congruently and neutrally primed emotional prosody, while the latter two conditions did not significantly differ. However, N100 peak latency was significantly delayed in the neutral condition compared to the other two conditions. Source reconstruction revealed that the right parahippocampal gyrus was more strongly activated in incongruent than in congruent trials in the N100 time window. No significant ERP effects were observed in the P200 range. Our results indicate that dynamic facial expressions influence vocal emotion processing at an early stage, and that an emotional mismatch between a facial expression and the ensuing vocal emotional signal induces additional processing costs in the brain, potentially because the cross-modal emotional prediction mechanism is violated when prime and target are emotionally incongruent.
Keywords: audiovisual; cross-modal prediction; dynamic faces; emotion; event-related potentials; parahippocampal gyrus; priming; prosody.