The impact of when, what and how predictions on auditory speech perception
- PMID: 31576421
- DOI: 10.1007/s00221-019-05661-5
Abstract
An impressive number of theoretical proposals and neurobiological studies argue that perceptual processing is not strictly feedforward but rather operates through an interplay between bottom-up sensory and top-down predictive mechanisms. The present EEG study aimed to further determine how prior knowledge of auditory syllables may impact speech perception. Prior knowledge was manipulated by presenting participants with visual information indicative of the syllable onset (when), its phonetic content (what) and/or its articulatory features (how). While when and what predictions consisted of unnatural visual cues (i.e., a visual timeline and a visuo-orthographic cue), how prediction consisted of the visual movements of a speaker. During auditory speech perception, when and what predictions both attenuated the amplitude of the N1/P2 auditory evoked potentials. Regarding how prediction, not only an amplitude decrease but also a latency facilitation of the N1/P2 auditory evoked potentials was observed during audiovisual compared to unimodal speech perception. However, when and what predictability effects were then reduced or abolished, with only what prediction reducing P2 amplitude but increasing its latency. Altogether, these results demonstrate the early influence of visually induced when, what and how predictions on cortical auditory speech processing. Crucially, they indicate a preponderant predictive role of the speaker's articulatory gestures during audiovisual speech perception, likely driven by attentional load and focus.
Keywords: Audiovisual speech perception; Auditory speech perception; EEG; Predictive coding; Predictive timing.
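The amplitude and latency effects reported in the abstract rest on standard peak measurements of the N1 (trough around 100 ms post-onset) and P2 (peak around 200 ms) components of the trial-averaged ERP. As a rough illustration only, the sketch below shows how such measurements are typically computed; the simulated single-channel data, sampling rate and analysis windows are assumptions for the example, not the study's actual parameters.

```python
# Minimal sketch (not the authors' pipeline): N1/P2 peak amplitude
# and latency from epoched EEG, demonstrated on simulated data.
import numpy as np

SFREQ = 500                                   # sampling rate, Hz (assumed)
TMIN = -0.2                                   # epoch start re. syllable onset, s
N_TRIALS = 80
N_SAMPLES = int((0.6 - TMIN) * SFREQ)
times = TMIN + np.arange(N_SAMPLES) / SFREQ   # time axis in seconds

rng = np.random.default_rng(0)
# Toy single-channel epochs (µV): an N1-like trough near 100 ms and a
# P2-like peak near 200 ms, buried in Gaussian noise.
erp = (-4.0 * np.exp(-((times - 0.10) ** 2) / (2 * 0.015 ** 2))
       + 3.0 * np.exp(-((times - 0.20) ** 2) / (2 * 0.025 ** 2)))
epochs = erp + rng.normal(0.0, 2.0, size=(N_TRIALS, N_SAMPLES))

def peak(evoked, times, tmin, tmax, polarity):
    """Peak amplitude (µV) and latency (s) within [tmin, tmax]."""
    mask = (times >= tmin) & (times <= tmax)
    idx = np.argmin(evoked[mask]) if polarity == "neg" else np.argmax(evoked[mask])
    return evoked[mask][idx], times[mask][idx]

evoked = epochs.mean(axis=0)                  # trial-averaged ERP
n1_amp, n1_lat = peak(evoked, times, 0.08, 0.15, "neg")
p2_amp, p2_lat = peak(evoked, times, 0.15, 0.25, "pos")
print(f"N1: {n1_amp:.2f} µV at {n1_lat * 1e3:.0f} ms")
print(f"P2: {p2_amp:.2f} µV at {p2_lat * 1e3:.0f} ms")
```

In a design like the one described, these per-condition amplitude and latency values (e.g., audiovisual vs. audio-only, with vs. without when/what cues) would then be compared statistically to test for attenuation and latency facilitation.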
Similar articles
- Electrophysiological evidence for a self-processing advantage during audiovisual speech integration. Exp Brain Res. 2017 Sep;235(9):2867-2876. doi: 10.1007/s00221-017-5018-0. Epub 2017 Jul 4. PMID: 28676921
- The timing of visual speech modulates auditory neural processing. Brain Lang. 2022 Dec;235:105196. doi: 10.1016/j.bandl.2022.105196. Epub 2022 Oct 28. PMID: 36343508
- Audiovisual speech asynchrony asymmetrically modulates neural binding. Neuropsychologia. 2024 Jun 6;198:108866. doi: 10.1016/j.neuropsychologia.2024.108866. Epub 2024 Mar 20. PMID: 38518889
- Prediction and constraint in audiovisual speech perception. Cortex. 2015 Jul;68:169-81. doi: 10.1016/j.cortex.2015.03.006. Epub 2015 Mar 20. PMID: 25890390. Free PMC article. Review.
- Some behavioral and neurobiological constraints on theories of audiovisual speech integration: a review and suggestions for new directions. Seeing Perceiving. 2011;24(6):513-39. doi: 10.1163/187847611X595864. Epub 2011 Sep 29. PMID: 21968081. Free PMC article. Review.
Cited by
- Top-Down Inference in the Auditory System: Potential Roles for Corticofugal Projections. Front Neural Circuits. 2021 Jan 22;14:615259. doi: 10.3389/fncir.2020.615259. eCollection 2020. PMID: 33551756. Free PMC article. Review.
- Predictive Processing in Poetic Language: Event-Related Potentials Data on Rhythmic Omissions in Metered Speech. Front Psychol. 2022 Jan 5;12:782765. doi: 10.3389/fpsyg.2021.782765. eCollection 2021. PMID: 35069363. Free PMC article.
- Effects of Closed Mouth vs. Exposed Teeth on Facial Expression Processing: An ERP Study. Behav Sci (Basel). 2025 Feb 1;15(2):163. doi: 10.3390/bs15020163. PMID: 40001794. Free PMC article.
- Face processing and early event-related potentials: replications and novel findings. Front Hum Neurosci. 2023 Oct 25;17:1268972. doi: 10.3389/fnhum.2023.1268972. eCollection 2023. PMID: 37954936. Free PMC article.