Transitions in neural oscillations reflect prediction errors generated in audiovisual speech
- PMID: 21552273
- DOI: 10.1038/nn.2810
Abstract
According to the predictive coding theory, top-down predictions are conveyed by backward connections and prediction errors are propagated forward across the cortical hierarchy. Using MEG in humans, we show that violating multisensory predictions causes a fundamental and qualitative change in both the frequency and spatial distribution of cortical activity. When visual speech input correctly predicted auditory speech signals, a slow delta regime (3-4 Hz) developed in higher-order speech areas. In contrast, when auditory signals invalidated predictions inferred from vision, a low-beta (14-15 Hz) / high-gamma (60-80 Hz) coupling regime appeared locally in a multisensory area (area STS). This frequency shift in oscillatory responses scaled with the degree of audio-visual congruence and was accompanied by increased gamma activity in lower sensory regions. These findings are consistent with the notion that bottom-up prediction errors are communicated in predominantly high (gamma) frequency ranges, whereas top-down predictions are mediated by slower (beta) frequencies.
Comment in
- When what you see is not what you hear. Nat Neurosci. 2011 Jun;14(6):675-6. doi: 10.1038/nn.2843. PMID: 21613995. No abstract available.
Similar articles
- Early and late beta-band power reflect audiovisual perception in the McGurk illusion. J Neurophysiol. 2015 Apr 1;113(7):2342-50. doi: 10.1152/jn.00783.2014. Epub 2015 Jan 7. PMID: 25568160. Free PMC article.
- Physical and perceptual factors shape the neural mechanisms that integrate audiovisual signals in speech comprehension. J Neurosci. 2011 Aug 3;31(31):11338-50. doi: 10.1523/JNEUROSCI.6510-10.2011. PMID: 21813693. Free PMC article.
- Neural oscillations in the temporal pole for a temporally congruent audio-visual speech detection task. Sci Rep. 2016 Nov 29;6:37973. doi: 10.1038/srep37973. PMID: 27897244. Free PMC article.
- Cortical oscillatory activity and the dynamics of auditory memory processing. Rev Neurosci. 2005;16(3):239-54. doi: 10.1515/revneuro.2005.16.3.239. PMID: 16323563. Review.
- Prediction and constraint in audiovisual speech perception. Cortex. 2015 Jul;68:169-81. doi: 10.1016/j.cortex.2015.03.006. Epub 2015 Mar 20. PMID: 25890390. Free PMC article. Review.
Cited by
- Predicting "When" Using the Motor System's Beta-Band Oscillations. Front Hum Neurosci. 2012 Aug 2;6:225. doi: 10.3389/fnhum.2012.00225. eCollection 2012. PMID: 22876228. Free PMC article. No abstract available.
- Audio-visual combination of syllables involves time-sensitive dynamics following from fusion failure. Sci Rep. 2020 Oct 22;10(1):18009. doi: 10.1038/s41598-020-75201-7. PMID: 33093570. Free PMC article.
- Hierarchically nested networks optimize the analysis of audiovisual speech. iScience. 2023 Feb 20;26(3):106257. doi: 10.1016/j.isci.2023.106257. eCollection 2023 Mar 17. PMID: 36909667. Free PMC article.
- Explaining flexible continuous speech comprehension from individual motor rhythms. Proc Biol Sci. 2023 Mar 8;290(1994):20222410. doi: 10.1098/rspb.2022.2410. Epub 2023 Mar 1. PMID: 36855868. Free PMC article.
- An orthographic prediction error as the basis for efficient visual word recognition. Neuroimage. 2020 Jul 1;214:116727. doi: 10.1016/j.neuroimage.2020.116727. Epub 2020 Mar 12. PMID: 32173410. Free PMC article.