On-line integration of semantic information from speech and gesture: insights from event-related brain potentials
- PMID: 17381252
- DOI: 10.1162/jocn.2007.19.4.605
Abstract
During language comprehension, listeners use the global semantic representation built from the previous sentence or discourse context to immediately integrate the meaning of each upcoming word into the unfolding message-level representation. Here we investigate whether communicative gestures that often spontaneously co-occur with speech are processed in a similar fashion and integrated with the previous sentence context in the same way as lexical meaning. Event-related potentials were measured while subjects listened to spoken sentences containing a critical verb (e.g., knock) that was accompanied by an iconic co-speech gesture (i.e., KNOCK). The verbal and/or gestural semantic content matched or mismatched the content of the preceding part of the sentence. Despite the differences in modality and in the specificity of meaning conveyed by spoken words and gestures, the latency, amplitude, and topographical distribution of the word- and gesture-mismatch effects were similar, indicating that the brain integrates both types of information simultaneously. This provides evidence for the claim that neural processing during language comprehension involves the simultaneous incorporation of information from a broader domain of cognition than verbal semantics alone. The neural evidence for similar integration of information from speech and gesture underscores the tight interconnection between speech and co-speech gestures.
Similar articles
- The role of iconic gestures in speech disambiguation: ERP evidence. J Cogn Neurosci. 2007 Jul;19(7):1175-92. doi: 10.1162/jocn.2007.19.7.1175. PMID: 17583993
- Neural correlates of bimodal speech and gesture comprehension. Brain Lang. 2004 Apr;89(1):253-60. doi: 10.1016/S0093-934X(03)00335-3. PMID: 15010257
- Neural correlates of the processing of co-speech gestures. Neuroimage. 2008 Feb 15;39(4):2010-24. doi: 10.1016/j.neuroimage.2007.10.055. Epub 2007 Nov 13. PMID: 18093845
- Neural mechanisms of language comprehension: challenges to syntax. Brain Res. 2007 May 18;1146:23-49. doi: 10.1016/j.brainres.2006.12.063. Epub 2006 Dec 23. PMID: 17400197. Review.
- Hearing and seeing meaning in speech and gesture: insights from brain and behaviour. Philos Trans R Soc Lond B Biol Sci. 2014 Sep 19;369(1651):20130296. doi: 10.1098/rstb.2013.0296. PMID: 25092664. Free PMC article. Review.
Cited by
- Neural integration of speech and gesture in schizophrenia: evidence for differential processing of metaphoric gestures. Hum Brain Mapp. 2013 Jul;34(7):1696-712. doi: 10.1002/hbm.22015. Epub 2012 Feb 29. PMID: 22378493. Free PMC article.
- Semantic Relationships Between Representational Gestures and Their Lexical Affiliates Are Evaluated Similarly for Speech and Text. Front Psychol. 2020 Oct 22;11:575991. doi: 10.3389/fpsyg.2020.575991. eCollection 2020. PMID: 33192884. Free PMC article.
- Gesture in the developing brain. Dev Sci. 2012 Mar;15(2):165-80. doi: 10.1111/j.1467-7687.2011.01100.x. Epub 2011 Nov 2. PMID: 22356173. Free PMC article.
- Gesture's body orientation modulates the N400 for visual sentences primed by gestures. Hum Brain Mapp. 2020 Dec;41(17):4901-4911. doi: 10.1002/hbm.25166. Epub 2020 Aug 18. PMID: 32808721. Free PMC article.
- I tawt I taw a puddy tat: Gestures in canary row narrations by high-functioning youth with autism spectrum disorder. Autism Res. 2017 Aug;10(8):1353-1363. doi: 10.1002/aur.1785. Epub 2017 Apr 1. PMID: 28371492. Free PMC article.