Seeing speech and seeing sign: insights from an fMRI study
- PMID: 19012106
- DOI: 10.1080/14992020802233907
Abstract
In a single fMRI study, we investigated silent speechreading and signed language processing in deaf native signers of British Sign Language (BSL) who were also proficient speechreaders of English. Separate analyses contrasted different aspects of the data. First, the left superior temporal cortex, including auditory regions, was activated more strongly in deaf than in hearing participants when they processed silently spoken (speechread) word lists. Second, within the signed language, cortical activation patterns reflected the presence and type of mouth action accompanying the manual sign. Signed items incorporating oral as well as manual actions were distinguished from signs using only manual actions, and signs with speechlike oral actions could be differentiated from those without. Thus, whether in speechreading or in sign language processing, speechlike mouth actions differentially activated regions of the superior temporal lobe that constitute auditory association cortex in hearing people. One inference is that speechlike oral actions may have preferential access to 'auditory speech' regions of the left superior temporal cortex in deaf people, not only during speechreading but also during signed language processing. For the deaf child, observation of speech is likely to help construct and constrain the parameters of spoken language acquisition. This has implications for programmes of intervention and for therapy following cochlear implantation.