Integrating face and voice in person perception
- PMID: 17997124
- DOI: 10.1016/j.tics.2007.10.001
Abstract
Integration of information from faces and voices plays a central role in our social interactions. It has mostly been studied in the context of audiovisual speech perception; the integration of affective or identity information has received comparatively little scientific attention. Here, we review behavioural and neuroimaging studies of face-voice integration in the context of person perception. Clear evidence of interference between facial and vocal information has been observed during affect recognition or identity processing. Integration effects on cerebral activity are apparent both at the level of heteromodal cortical regions of convergence, particularly the bilateral posterior superior temporal sulcus (pSTS), and at 'unimodal' levels of sensory processing. Whether the latter reflects feedback mechanisms or direct crosstalk between auditory and visual cortices remains unclear.
Similar articles
- Hearing facial identities: brain correlates of face-voice integration in person identification. Cortex. 2011 Oct;47(9):1026-37. doi: 10.1016/j.cortex.2010.11.011. Epub 2010 Dec 4. PMID: 21208611
- Audiovisual integration of emotional signals in voice and face: an event-related fMRI study. Neuroimage. 2007 Oct 1;37(4):1445-56. doi: 10.1016/j.neuroimage.2007.06.020. Epub 2007 Jul 4. PMID: 17659885
- Interaction of face and voice areas during speaker recognition. J Cogn Neurosci. 2005 Mar;17(3):367-76. doi: 10.1162/0898929053279577. PMID: 15813998
- Thinking the voice: neural correlates of voice perception. Trends Cogn Sci. 2004 Mar;8(3):129-35. doi: 10.1016/j.tics.2004.01.008. PMID: 15301753. Review.
- Auditory recognition expertise and domain specificity. Brain Res. 2008 Jul 18;1220:191-8. doi: 10.1016/j.brainres.2008.01.014. Epub 2008 Jan 18. PMID: 18299121. Review.
Cited by
- Causal Analysis of Activity in Social Brain Areas During Human-Agent Conversation. Front Neuroergon. 2022 May 17;3:843005. doi: 10.3389/fnrgo.2022.843005. eCollection 2022. PMID: 38235459. Free PMC article.
- Modeling Emotional Valence Integration From Voice and Touch. Front Psychol. 2018 Oct 12;9:1966. doi: 10.3389/fpsyg.2018.01966. eCollection 2018. PMID: 30369901. Free PMC article.
- Seeing who we hear and hearing who we see. Proc Natl Acad Sci U S A. 2009 Jan 20;106(3):669-70. doi: 10.1073/pnas.0811894106. Epub 2009 Jan 14. PMID: 19144916. Free PMC article. No abstract available.
- How the human brain exchanges information across sensory modalities to recognize other people. Hum Brain Mapp. 2015 Jan;36(1):324-39. doi: 10.1002/hbm.22631. Epub 2014 Sep 13. PMID: 25220190. Free PMC article.
- Touching lips and hearing fingers: effector-specific congruency between tactile and auditory stimulation modulates N1 amplitude and alpha desynchronization. Exp Brain Res. 2018 Jan;236(1):13-29. doi: 10.1007/s00221-017-5104-3. Epub 2017 Oct 16. PMID: 29038847. Free PMC article.