Temporal window of integration in auditory-visual speech perception
- PMID: 16530232
- DOI: 10.1016/j.neuropsychologia.2006.01.001
Abstract
Forty-three normal-hearing participants were tested in two experiments focused on temporal coincidence in auditory-visual (AV) speech perception. In these experiments, audio recordings of /pa/ and /ba/ were dubbed onto video recordings of /ka/ or /ga/, respectively (ApVk, AbVg), to produce the illusory "fusion" percepts /ta/ or /da/ [McGurk, H., & MacDonald, J. (1976). Hearing lips and seeing voices. Nature, 264, 746-747]. In Experiment 1, an identification task using McGurk pairs with asynchronies ranging from -467 ms (auditory lead) to +467 ms was conducted. Fusion responses were prevalent over temporal asynchronies from -30 ms to +170 ms and more robust for audio lags. In Experiment 2, simultaneity judgments for incongruent and congruent audiovisual tokens (AdVd, AtVt) were collected. McGurk pairs were more readily judged as asynchronous than congruent pairs. Characteristics of the temporal window over which simultaneity and fusion responses were maximal were quite similar, suggesting the existence of a 200 ms duration asymmetric bimodal temporal integration window.