Dual neural routing of visual facilitation in speech processing
- PMID: 19864557
- PMCID: PMC6665008
- DOI: 10.1523/JNEUROSCI.3194-09.2009
Abstract
Viewing our interlocutor facilitates speech perception, unlike, for instance, when we speak on the telephone. Several neural routes and mechanisms could account for this phenomenon. Using magnetoencephalography, we show that when the interlocutor is visible, latencies of auditory responses (M100) shorten in proportion to how predictable speech is from visual input, whether or not the auditory signal was congruent with it. Incongruence of auditory and visual input affected auditory responses approximately 20 ms after latency shortening was detected, indicating that an initial content-dependent auditory facilitation by vision is followed by a feedback signal reflecting the error between expected and received auditory input (prediction error). We then used functional magnetic resonance imaging and confirmed that distinct routes of visual information to auditory processing underlie these two functional mechanisms. Functional connectivity between visual motion and auditory areas depended on the degree of visual predictability, whereas connectivity between the superior temporal sulcus and both auditory and visual motion areas was driven by audiovisual (AV) incongruence. These results establish two distinct mechanisms by which the brain uses potentially predictive visual information to improve auditory perception: a fast, direct corticocortical pathway that conveys visual motion parameters to auditory cortex, and a slower, indirect feedback pathway that signals the error between visual prediction and auditory input.
Similar articles
- Time course of early audiovisual interactions during speech and nonspeech central auditory processing: a magnetoencephalography study. J Cogn Neurosci. 2009 Feb;21(2):259-74. doi: 10.1162/jocn.2008.21019. PMID: 18510440
- Natural, metaphoric, and linguistic auditory direction signals have distinct influences on visual motion processing. J Neurosci. 2009 May 20;29(20):6490-9. doi: 10.1523/JNEUROSCI.5437-08.2009. PMID: 19458220. Free PMC article.
- Sequential audiovisual interactions during speech perception: a whole-head MEG study. Neuropsychologia. 2007 Mar 25;45(6):1342-54. doi: 10.1016/j.neuropsychologia.2006.09.019. Epub 2006 Oct 25. PMID: 17067640
- Human cortical areas underlying the perception of optic flow: brain imaging studies. Int Rev Neurobiol. 2000;44:269-92. doi: 10.1016/s0074-7742(08)60746-1. PMID: 10605650. Review.
- Prediction and constraint in audiovisual speech perception. Cortex. 2015 Jul;68:169-81. doi: 10.1016/j.cortex.2015.03.006. Epub 2015 Mar 20. PMID: 25890390. Free PMC article. Review.
Cited by
- Vision perceptually restores auditory spectral dynamics in speech. Proc Natl Acad Sci U S A. 2020 Jul 21;117(29):16920-16927. doi: 10.1073/pnas.2002887117. Epub 2020 Jul 6. PMID: 32632010. Free PMC article.
- Eye Can Hear Clearly Now: Inverse Effectiveness in Natural Audiovisual Speech Processing Relies on Long-Term Crossmodal Temporal Integration. J Neurosci. 2016 Sep 21;36(38):9888-95. doi: 10.1523/JNEUROSCI.1396-16.2016. PMID: 27656026. Free PMC article.
- On the role of crossmodal prediction in audiovisual emotion perception. Front Hum Neurosci. 2013 Jul 18;7:369. doi: 10.3389/fnhum.2013.00369. eCollection 2013. PMID: 23882204. Free PMC article.
- Assessing the effect of physical differences in the articulation of consonants and vowels on audiovisual temporal perception. Front Integr Neurosci. 2012 Oct 1;6:71. doi: 10.3389/fnint.2012.00071. eCollection 2012. PMID: 23060756. Free PMC article.
- Audiovisual speech integration does not rely on the motor system: evidence from articulatory suppression, the McGurk effect, and fMRI. J Cogn Neurosci. 2014 Mar;26(3):606-20. doi: 10.1162/jocn_a_00515. Epub 2013 Nov 18. PMID: 24236768. Free PMC article.