Hum Brain Mapp. 2009 Nov;30(11):3509-26. doi: 10.1002/hbm.20774.

Co-speech gestures influence neural activity in brain regions associated with processing semantic information


Anthony Steven Dick et al.

Abstract

Everyday communication is accompanied by visual information from several sources, including co-speech gestures, which provide semantic information listeners use to help disambiguate the speaker's message. Using fMRI, we examined how gestures influence neural activity in brain regions associated with processing semantic information. The BOLD response was recorded while participants listened to stories under three audiovisual conditions and one auditory-only (speech alone) condition. In the first audiovisual condition, the storyteller produced gestures that naturally accompany speech. In the second, the storyteller made semantically unrelated hand movements. In the third, the storyteller kept her hands still. In addition to inferior parietal and posterior superior and middle temporal regions, bilateral posterior superior temporal sulcus and left anterior inferior frontal gyrus responded more strongly to speech when it was further accompanied by gesture, regardless of the semantic relation to speech. However, the right inferior frontal gyrus was sensitive to the semantic import of the hand movements, demonstrating more activity when hand movements were semantically unrelated to the accompanying speech. These findings show that perceiving hand movements during speech modulates the distributed pattern of neural activation involved in both biological motion perception and discourse comprehension, suggesting listeners attempt to find meaning, not only in the words speakers produce, but also in the hand movements that accompany speech.


Figures

Figure 1
Time‐course of the experiment for one of the two runs. Shown are still pictures from videos corresponding to each of the language conditions. The blank screen with a fixation cross was shown during the No‐Visual‐Input and the Baseline conditions.
Figure 2
Whole‐brain analysis results for each condition compared to Baseline. The individual per‐voxel threshold was P < 0.01 (corrected FWE P < 0.05).
Figure 3
(A) Results of the intersection analysis for conditions with hand movements relative to no hand movements. G, Gesture; S‐A, Self‐Adaptor; N‐H‐M, No‐Hand‐Movement; N‐V‐I, No‐Visual‐Input. (B) Results of the analysis for the posterior superior temporal sulcus (STSp) region of interest. *** P < 0.001.
Figure 4
(A) Results of the whole‐brain comparison between Self‐Adaptor and Gesture. The figure shows the left and right hemispheres (no difference was detected on the left). (B) Results of the analysis for inferior frontal regions of interest. Comparisons were conducted only between Gesture and all other conditions (differences between Self‐Adaptor and No‐Hand‐Movement, between Self‐Adaptor and No‐Visual‐Input, and between No‐Hand‐Movement and No‐Visual‐Input were not a focus and were not statistically assessed). Left pars triangularis (IFGTr) was sensitive to gestures relative to conditions without hand movements. Both right IFGTr and pars opercularis (IFGOp) were more active for Self‐Adaptor than for Gesture, but notably, the interaction analysis assessing the moderating influence of hemisphere was significant only for IFGTr.

