The role of speech production system in audiovisual speech perception
- PMID: 20922046
- PMCID: PMC2948144
- DOI: 10.2174/1874440001004020030
Abstract
Seeing the articulatory gestures of the speaker significantly enhances speech perception. Findings from recent neuroimaging studies suggest that activation of the speech motor system during lipreading enhances speech perception by tuning, in a top-down fashion, speech-sound processing in the superior aspects of the posterior temporal lobe. Anatomically, the superior-posterior temporal lobe areas receive connections from the auditory, visual, and speech motor cortical areas. Thus, it is possible that neuronal receptive fields are shaped during development to respond to speech-sound features that coincide with visual and motor speech cues, in contrast to the anterior/lateral temporal lobe areas, which might process speech sounds predominantly on the basis of acoustic cues. The superior-posterior temporal lobe areas have also been consistently associated with auditory spatial processing. Thus, the involvement of these areas in audiovisual speech perception might be partly explained by the spatial processing requirements of associating sounds, seen articulations, and one's own motor movements. Tentatively, the anterior "what" and posterior "where/how" auditory cortical processing pathways may be parts of an interacting network whose instantaneous state determines what one ultimately perceives, as potentially reflected in the dynamics of oscillatory activity.
Keywords: Audiovisual speech perception; electroencephalography; functional MRI; magnetoencephalography; speech motor theory.