Electrophysiological evidence for a self-processing advantage during audiovisual speech integration
- PMID: 28676921
- DOI: 10.1007/s00221-017-5018-0
Abstract
Previous electrophysiological studies have provided strong evidence for early multisensory integrative mechanisms during audiovisual speech perception. One issue left unanswered by these studies is whether hearing our own voice and seeing our own articulatory gestures facilitate speech perception, possibly through better processing and integration of sensory inputs with our own sensory-motor knowledge. The present EEG study examined the impact of self-knowledge during the perception of auditory (A), visual (V) and audiovisual (AV) speech stimuli that had previously been recorded either from the participant or from a speaker the participant had never met. Audiovisual interactions were estimated by comparing N1 and P2 auditory evoked potentials in the bimodal condition (AV) with the sum of those observed in the unimodal conditions (A + V). In line with previous EEG studies, our results revealed a decrease in the amplitude of P2 auditory evoked potentials in the AV compared to the A + V condition. Crucially, a temporal facilitation of N1 responses was observed during the visual perception of one's own speech movements compared to those of another speaker. This facilitation was negatively correlated with the saliency of the visual stimuli. These results provide evidence for a temporal facilitation of the integration of auditory and visual speech signals when the visual input involves our own speech gestures.
Keywords: Audiovisual integration; EEG; Self recognition; Speech perception.
Similar articles
- The impact of when, what and how predictions on auditory speech perception. Exp Brain Res. 2019 Dec;237(12):3143-3153. doi: 10.1007/s00221-019-05661-5. Epub 2019 Oct 1. PMID: 31576421
- Electrophysiological evidence for speech-specific audiovisual integration. Neuropsychologia. 2014 Jan;53:115-21. doi: 10.1016/j.neuropsychologia.2013.11.011. Epub 2013 Nov 27. PMID: 24291340
- Neural correlates of multisensory integration of ecologically valid audiovisual events. J Cogn Neurosci. 2007 Dec;19(12):1964-73. doi: 10.1162/jocn.2007.19.12.1964. PMID: 17892381
- Quantifying lip-read-induced suppression and facilitation of the auditory N1 and P2 reveals peak enhancements and delays. Psychophysiology. 2016 Sep;53(9):1295-306. doi: 10.1111/psyp.12683. Epub 2016 Jun 13. PMID: 27295181. Review.
- Audiovisual speech integration in the superior temporal region is dysfunctional in dyslexia. Neuroscience. 2017 Jul 25;356:1-10. doi: 10.1016/j.neuroscience.2017.05.017. Epub 2017 May 18. PMID: 28527953. Review.
Cited by
- No "Self" Advantage for Audiovisual Speech Aftereffects. Front Psychol. 2019 Mar 22;10:658. doi: 10.3389/fpsyg.2019.00658. eCollection 2019. PMID: 30967827. Free PMC article.
- The Processing of Audiovisual Speech Is Linked with Vocabulary in Autistic and Nonautistic Children: An ERP Study. Brain Sci. 2023 Jul 8;13(7):1043. doi: 10.3390/brainsci13071043. PMID: 37508976. Free PMC article.
- The impact of when, what and how predictions on auditory speech perception. Exp Brain Res. 2019 Dec;237(12):3143-3153. doi: 10.1007/s00221-019-05661-5. Epub 2019 Oct 1. PMID: 31576421