Visual influences on alignment to voice onset time
- PMID: 20220027
- DOI: 10.1044/1092-4388(2009/08-0247)
Abstract
Purpose: Speech shadowing experiments were conducted to test whether alignment (inadvertent imitation) to voice onset time (VOT) can be influenced by visual speech information.
Method: Experiment 1 examined whether alignment would occur to auditory /pa/ syllables manipulated to have 3 different VOTs. Nineteen female participants were asked to listen to 180 syllables over headphones and to say each syllable out loud quickly and clearly. In Experiment 2, visual speech tokens composed of a face articulating /pa/ syllables at 2 different rates were dubbed onto the audio /pa/ syllables of Experiment 1. Sixteen new female participants were asked to listen to and watch (over a video monitor) 180 syllables and to say each syllable out loud quickly and clearly.
Results: Results of Experiment 1 showed that the 3 VOTs of the audio /pa/ stimuli influenced the VOTs of the participants' produced syllables. Results of Experiment 2 revealed that both the visible syllable rate and audio VOT of the audiovisual /pa/ stimuli influenced the VOTs of the participants' produced syllables.
Conclusion: These results show that, like auditory speech, visual speech information can induce speech alignment to a phonetically relevant property of an utterance.
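As a rough illustration of how VOT alignment of this kind could be quantified (this sketch is not the article's analysis; the stimulus VOT values, the example numbers, and the use of a least-squares slope as an alignment index are all assumptions), one could regress a participant's produced VOTs on the VOTs of the stimuli they shadowed:

```python
# Hypothetical sketch: quantify VOT alignment as the slope of produced VOT
# regressed on stimulus VOT. A slope near 0 suggests no alignment; a positive
# slope means produced VOTs shift in the direction of the stimulus VOTs.
# The numbers below are illustrative example data, not values from the study.
import numpy as np

stimulus_vot = np.array([40, 40, 70, 70, 100, 100], dtype=float)  # ms, assumed stimulus VOTs
produced_vot = np.array([55, 58, 66, 70, 78, 82], dtype=float)    # ms, one participant (illustrative)

# Least-squares fit: produced = slope * stimulus + intercept
slope, intercept = np.polyfit(stimulus_vot, produced_vot, deg=1)
print(f"alignment slope = {slope:.2f} ms per ms of stimulus VOT")
```

A positive slope across participants would correspond to the pattern the abstract reports, namely that produced VOTs track the VOT of the shadowed stimulus.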
Similar articles
- Audio-visual interactions with intact clearly audible speech. Q J Exp Psychol A. 2004 Aug;57(6):1103-21. doi: 10.1080/02724980343000701. PMID: 15370518
- Impact of language on development of auditory-visual speech perception. Dev Sci. 2008 Mar;11(2):306-20. doi: 10.1111/j.1467-7687.2008.00677.x. PMID: 18333984
- The effect of voice-onset-time on dichotic listening with consonant-vowel syllables. Neuropsychologia. 2006;44(2):191-6. doi: 10.1016/j.neuropsychologia.2005.05.006. Epub 2005 Jul 14. PMID: 16023155. Clinical Trial.
- Alignment to visual speech information. Atten Percept Psychophys. 2010 Aug;72(6):1614-25. doi: 10.3758/APP.72.6.1614. PMID: 20675805
- The influence of visual and auditory information on the perception of speech and non-speech oral movements in patients with left hemisphere lesions. Clin Linguist Phon. 2009 Mar;23(3):208-21. doi: 10.1080/02699200802399913. PMID: 19283578
Cited by
- Is speech alignment to talkers or tasks? Atten Percept Psychophys. 2013 Nov;75(8):1817-26. doi: 10.3758/s13414-013-0517-y. PMID: 23907619. Free PMC article.
- Prediction and imitation in speech. Front Psychol. 2013 Jun 21;4:340. doi: 10.3389/fpsyg.2013.00340. eCollection 2013. PMID: 23801971. Free PMC article.
- Measuring phonetic convergence in speech production. Front Psychol. 2013 Aug 27;4:559. doi: 10.3389/fpsyg.2013.00559. eCollection 2013. PMID: 23986738. Free PMC article.
- Visibility of speech articulation enhances auditory phonetic convergence. Atten Percept Psychophys. 2016 Jan;78(1):317-33. doi: 10.3758/s13414-015-0982-6. PMID: 26358471. Free PMC article.
- Semantic priming from McGurk words: Priming depends on perception. Atten Percept Psychophys. 2023 May;85(4):1219-1237. doi: 10.3758/s13414-023-02689-2. Epub 2023 Apr 25. PMID: 37155085