Dynamic Facial Expressions Prime the Processing of Emotional Prosody

Patricia Garrido-Vásquez et al. Front. Hum. Neurosci. 12:244 (12 June 2018). doi: 10.3389/fnhum.2018.00244

Abstract

Evidence suggests that emotion is represented supramodally in the human brain. Emotional facial expressions, which often precede vocally expressed emotion in real life, can modulate event-related potentials (N100 and P200) during emotional prosody processing. To investigate these cross-modal emotional interactions, two lines of research have been put forward: cross-modal integration and cross-modal priming. In cross-modal integration studies, visual and auditory channels are temporally aligned, while in priming studies they are presented consecutively. Here we used cross-modal emotional priming to study the interaction of dynamic visual and auditory emotional information. Specifically, we presented dynamic facial expressions (angry, happy, neutral) as primes and emotionally intoned pseudo-speech sentences (angry, happy) as targets. We were interested in how prime-target congruency would affect early auditory event-related potentials, i.e., N100 and P200, in order to shed more light on how dynamic facial information is used in cross-modal emotional prediction. Results showed enhanced N100 amplitudes for incongruently primed compared to congruently and neutrally primed emotional prosody, while the latter two conditions did not significantly differ. However, N100 peak latency was significantly delayed in the neutral condition compared to the other two conditions. Source reconstruction revealed that the right parahippocampal gyrus was activated in incongruent compared to congruent trials in the N100 time window. No significant ERP effects were observed in the P200 range. Our results indicate that dynamic facial expressions influence vocal emotion processing at an early point in time, and that an emotional mismatch between a facial expression and its ensuing vocal emotional signal induces additional processing costs in the brain, potentially because the cross-modal emotional prediction mechanism is violated in the case of emotional prime-target incongruency.
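The N100 measures reported above (amplitude and peak latency within an analysis window, computed on ERPs averaged over trials and electrodes and time-locked to target onset) can be illustrated with a minimal sketch. The window bounds and all variable names here are hypothetical, chosen only for illustration; the paper's exact analysis parameters are not given in the abstract.

```python
import numpy as np

def n100_measures(erp, times, window=(0.08, 0.13)):
    """Mean amplitude and peak latency of the N100 within a time window.

    erp    : 1-D array of voltages (an averaged ERP waveform)
    times  : 1-D array of time points in seconds, time-locked to target onset
    window : (start, end) of the analysis window in seconds
             (hypothetical values, for illustration only)
    """
    mask = (times >= window[0]) & (times <= window[1])
    seg, seg_t = erp[mask], times[mask]
    mean_amp = seg.mean()
    peak_idx = seg.argmin()  # N100 is a negative-going deflection: take the minimum
    return mean_amp, seg_t[peak_idx]

# Toy waveform: a negative peak near 100 ms on a flat baseline
times = np.linspace(-0.1, 0.5, 601)  # 1 ms resolution
erp = -5.0 * np.exp(-((times - 0.10) ** 2) / (2 * 0.01 ** 2))
amp, lat = n100_measures(erp, times)
```

On this toy waveform the peak latency lands at about 0.10 s and the mean amplitude is negative, matching the direction of the N100 effects described in the abstract (larger, i.e., more negative, amplitudes for incongruent trials; delayed latencies after neutral primes).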

Keywords: audiovisual; cross-modal prediction; dynamic faces; emotion; event-related potentials; parahippocampal gyrus; priming; prosody.


Figures

Figure 1
Still example frames from the video primes, displaying angry (left), happy (middle), and neutral (right) facial expressions; one actor per row.
Figure 2
Event-related potentials averaged over all included electrodes for the three congruency conditions and the incongruent minus congruent difference, time-locked to target onset. The time window for the N100 analysis is shaded in gray. The scalp potential map shows the incongruent minus congruent difference in the N100 time window.
Figure 3
Results from the source reconstruction analysis showing significant clusters in the right parahippocampal gyrus. (A) Incongruent > congruent, z = -12. (B) Anger incongruent > anger congruent, z = -14. Images are thresholded at p < 0.05 family-wise error corrected.

