Can you hear me yet? An intracranial investigation of speech and non-speech audiovisual interactions in human cortex
- PMID: 27182530
- PMCID: PMC4865257
- DOI: 10.1080/23273798.2015.1101145
Abstract
In everyday conversation, viewing a talker's face can provide information about the timing and content of an upcoming speech signal, resulting in improved intelligibility. Using electrocorticography, we tested whether human auditory cortex in Heschl's gyrus (HG) and on the superior temporal gyrus (STG), as well as motor cortex on the precentral gyrus (PreC), were responsive to visual/gestural information prior to the onset of sound, and whether early stages of auditory processing were sensitive to the visual content (speech syllable versus non-speech motion). Event-related band power (ERBP) in the high gamma band was content-specific prior to acoustic onset on STG and PreC, and ERBP in the beta band differed in all three areas. Following sound onset, we found no evidence for content-specificity in HG, evidence for visual specificity in PreC, and specificity for both modalities in STG. These results support models of audiovisual processing in which sensory information is integrated in non-primary cortical areas.
Keywords: Cross-modal; Electrocorticography; Multisensory; Speech.
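The dependent measure named in the abstract, event-related band power (ERBP), is a band-limited power envelope expressed relative to a pre-stimulus baseline. As a rough illustrative sketch only (the abstract does not describe the authors' analysis pipeline; the filter design, band edges, and baseline window below are assumptions), ERBP is commonly estimated by band-pass filtering the signal, taking the Hilbert envelope, and normalizing to baseline power:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def event_related_band_power(ecog, fs, band=(70.0, 150.0), baseline=(0, None)):
    """Band-limited power envelope of a single-trial ECoG trace, in dB
    relative to a baseline window. A generic ERBP estimate, not the
    authors' exact pipeline."""
    # Band-pass filter to the frequency band of interest
    # (e.g. ~70-150 Hz for high gamma; exact edges are an assumption).
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecog)

    # Instantaneous power from the envelope of the analytic signal.
    power = np.abs(hilbert(filtered)) ** 2

    # Normalize to mean baseline power and convert to dB.
    base = power[baseline[0]:baseline[1]].mean()
    return 10.0 * np.log10(power / base)

# Hypothetical usage: 2 s of simulated data at 1 kHz, baseline = first 500 ms.
fs = 1000
trial = np.random.randn(2 * fs)
erbp = event_related_band_power(trial, fs, band=(70, 150), baseline=(0, 500))
```

The same routine applies to other bands (e.g. beta) by changing the band edges; statistical comparison of ERBP across conditions and areas is a separate step not shown here.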