Sign and Spoken Language Processing Differences in the Brain: A Brief Review of Recent Research
- PMID: 35875424
- PMCID: PMC9305909
- DOI: 10.1177/09727531211070538
Abstract
Background: It is currently accepted that sign languages and spoken languages share significant processing commonalities. The evidence supporting this, however, has largely been confined to frontotemporal pathways, perisylvian language areas, hemispheric lateralization, and event-related potentials in typical settings. Recent work has moved beyond these domains and uncovered numerous modality-dependent processing differences between sign and spoken languages, both by accounting for confounds that previously invalidated processing comparisons and by delving into the specific conditions under which the differences arise. Yet these processing differences are often summarily dismissed as not specific to language.
Summary: This review examined recent neuroscientific evidence for processing differences between sign and spoken language modalities, along with the arguments against these differences' importance. Key distinctions exist in the topography of the left anterior negativity (LAN) and in modulations of event-related potential (ERP) components such as the N400. There is also differential activation of typical spoken language processing areas, such as the conditional role of the temporal areas in sign language (SL) processing. Importantly, sign language processing uniquely recruits parietal areas for processing phonology and syntax and requires the mapping of spatial information onto internal representations. Additionally, modality-specific feedback mechanisms distinctively involve proprioceptive post-output monitoring in sign languages, in contrast to the auditory and visual feedback mechanisms of spoken languages. The only study to find ERP differences post-production revealed earlier lexical access in sign than in spoken languages. Themes of temporality, the validity of an "analogous anatomical mechanisms" viewpoint, and the comprehensiveness of current language models are also discussed to suggest improvements for future research.
Key message: Current neuroscience evidence suggests various ways in which processing differs between sign and spoken language modalities that extend beyond simple differences between languages. Consideration and further exploration of these differences will be integral in developing a more comprehensive view of language in the brain.
Keywords: Cognitive neuroscience; Language comprehension; Language production; N400; Parietal; Sign language; Spoken language.
© 2022 Indian Academy of Neurosciences (IAN).
Conflict of interest statement
Declaration of Conflicting Interests: The author declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.