Pupil dilation reflects the dynamic integration of audiovisual emotional speech
- PMID: 37016041
- PMCID: PMC10073148
- DOI: 10.1038/s41598-023-32133-2
Erratum in
- Author Correction: Pupil dilation reflects the dynamic integration of audiovisual emotional speech. Sci Rep. 2023 Apr 25;13(1):6766. doi: 10.1038/s41598-023-33845-1. PMID: 37185801.
Abstract
Emotional speech perception is a multisensory process. When speaking with an individual, we concurrently integrate the information from their voice and face to decode, for example, their feelings, moods, and emotions. However, the physiological reactions, such as the reflexive dilation of the pupil, associated with these processes remain mostly unknown. The aim of the current article is to investigate whether pupillary reactions can index the processes underlying the audiovisual integration of emotional signals. To address this question, we used an algorithm able to increase or decrease the smiles seen in a person's face or heard in their voice, while preserving the temporal synchrony between the visual and auditory channels. Using this algorithm, we created congruent and incongruent audiovisual smiles and investigated participants' gaze and pupillary reactions to the manipulated stimuli. We found that pupil reactions can reflect emotional information mismatch in audiovisual speech. In our data, when participants were explicitly asked to extract emotional information from the stimuli, the first fixation within emotionally mismatching areas (i.e., the mouth) triggered pupil dilation. These results reveal that pupil dilation can reflect the dynamic integration of audiovisual emotional speech and provide insights into how these reactions are triggered during stimulus perception.
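To make the fixation-locked analysis described above concrete, the sketch below shows one way such a measure could be computed: extract the pupil trace time-locked to the first gaze sample falling inside a mouth area of interest (AOI) and express it relative to a pre-fixation baseline. This is an illustrative sketch only, not the authors' pipeline; the AOI coordinates, sampling rate, and variable names (gaze_x, gaze_y, pupil, MOUTH_AOI) are hypothetical assumptions.

```python
# Illustrative sketch (not the authors' pipeline): baseline-corrected pupil
# dilation locked to the first gaze sample inside a hypothetical mouth AOI.
import numpy as np

MOUTH_AOI = (300, 500, 400, 480)   # hypothetical x_min, x_max, y_min, y_max (pixels)
FS = 500                            # assumed eye-tracker sampling rate (Hz)

def first_mouth_fixation_response(gaze_x, gaze_y, pupil, baseline_ms=200, window_ms=1000):
    """Return the pupil trace time-locked to the first sample inside the mouth AOI,
    expressed as change from a pre-fixation baseline (or None if unavailable)."""
    x_min, x_max, y_min, y_max = MOUTH_AOI
    inside = (gaze_x >= x_min) & (gaze_x <= x_max) & (gaze_y >= y_min) & (gaze_y <= y_max)
    onsets = np.flatnonzero(inside)
    if onsets.size == 0:
        return None                                   # no fixation on the mouth in this trial
    onset = onsets[0]
    n_base = int(baseline_ms / 1000 * FS)
    n_win = int(window_ms / 1000 * FS)
    if onset < n_base or onset + n_win > pupil.size:
        return None                                   # not enough samples around the event
    baseline = pupil[onset - n_base:onset].mean()     # mean pupil size just before the fixation
    return pupil[onset:onset + n_win] - baseline      # dilation relative to that baseline

# Usage idea: average the returned traces across trials, separately for congruent
# and incongruent audiovisual smiles, and compare the resulting time courses.
```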
© 2023. The Author(s).
Conflict of interest statement
The authors declare no competing interests.
