Proc Natl Acad Sci U S A. 2009 Dec 8;106(49):20664-9. doi: 10.1073/pnas.0909197106. Epub 2009 Nov 18.

Symbolic gestures and spoken language are processed by a common neural system

Jiang Xu et al.

Abstract

Symbolic gestures, such as pantomimes that signify actions (e.g., threading a needle) or emblems that facilitate social transactions (e.g., finger to lips indicating "be quiet"), play an important role in human communication. They are autonomous, can fully take the place of words, and function as complete utterances in their own right. The relationship between these gestures and spoken language remains unclear. We used functional MRI to investigate whether these two forms of communication are processed by the same system in the human brain. Responses to symbolic gestures, to their spoken glosses (expressing the gestures' meaning in English), and to visually and acoustically matched control stimuli were compared in a randomized block design. General Linear Model (GLM) contrasts identified shared and unique activations, and functional connectivity analyses delineated regional interactions associated with each condition. Results support a model in which bilateral modality-specific areas in superior and inferior temporal cortices extract salient features from vocal-auditory and gestural-visual stimuli, respectively. However, both classes of stimuli activate a common, left-lateralized network of inferior frontal and posterior temporal regions in which symbolic gestures and spoken words may be mapped onto common, corresponding conceptual representations. We suggest that these anterior and posterior perisylvian areas, identified since the mid-19th century as the core of the brain's language system, are not in fact committed to language processing, but may function as a modality-independent semiotic system that plays a broader role in human communication, linking meaning with symbols whether these are words, gestures, images, sounds, or objects.
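The GLM contrast and conjunction approach named in the abstract can be sketched in a few lines. This is a toy illustration, not the authors' actual pipeline: the block design, regressor names, thresholds, and data here are all invented for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_scans, n_voxels = 120, 5

# Toy design matrix: three 40-scan blocks (gesture, speech, baseline)
# plus an intercept column. Real fMRI designs would convolve these
# regressors with a hemodynamic response function.
blocks = np.kron(np.eye(3), np.ones((40, 1)))     # shape (120, 3)
X = np.hstack([blocks, np.ones((n_scans, 1))])    # add intercept

Y = rng.standard_normal((n_scans, n_voxels))      # toy BOLD time series
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)      # per-voxel GLM betas

# Contrast weights select "condition minus baseline" effects.
c_gesture = np.array([1.0, 0.0, -1.0, 0.0])
c_speech = np.array([0.0, 1.0, -1.0, 0.0])
effect_gesture = c_gesture @ beta                 # one value per voxel
effect_speech = c_speech @ beta

# A conjunction analysis keeps only voxels where BOTH contrasts
# exceed threshold (here simply > 0 for illustration).
conjunction = (effect_gesture > 0) & (effect_speech > 0)
```

In this scheme, the shared gesture/speech network reported in the paper corresponds to voxels surviving the conjunction, while the condition-specific maps come from directly contrasting the two conditions against each other.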


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Fig. 1.
Examples of pantomimes (top two rows; English glosses: A, juggle balls; B, unscrew jar) and emblems (bottom two rows; English glosses: C, I've got it!; D, settle down).
Fig. 2.
Common areas of activation for processing symbolic gestures and spoken language, minus their respective baselines, identified using a random-effects conjunction analysis. The resultant t map is rendered on a single-subject T1 image: 3D surface rendering above; axial slices, with associated z-axis coordinates, below.
Fig. 3.
Condition-specific activations for symbolic gesture (Top) and for speech (Bottom), estimated by contrasting symbolic gestures and spoken language (minus their respective baselines) using random-effects, paired two-sample t tests. Maps are rendered as in Fig. 2.
Fig. 4.
Functional connections between left-hemisphere seed regions (IFG, MTG) and other brain areas. (A) Correlations that exceeded threshold for both symbolic gesture and spoken language and did not differ significantly between these conditions. Unique correlations, illustrated in (B) for speech and (C) for gesture, exceeded threshold in only one of these conditions. Lines connote z-transformed correlation coefficients >3 that met criteria. (STG, superior temporal gyrus; FUS, inferior temporal and fusiform gyri; PFC, dorsolateral prefrontal cortex; PHP, parahippocampal gyrus; CBH, cerebellar hemisphere.)
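The "z-transformed correlation coefficients" criterion in the Fig. 4 caption can be sketched with toy data. This assumes the standard Fisher r-to-z statistic (arctanh of r, scaled by sqrt(n - 3)); the seed series, noise levels, and sample size below are invented for illustration, and only the >3 threshold comes from the caption.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100

# Toy seed-region time series and a correlated "target" region series.
seed = rng.standard_normal(n)
target = 0.9 * seed + 0.3 * rng.standard_normal(n)

# Pearson correlation between the two regional time series.
r = np.corrcoef(seed, target)[0, 1]

# Fisher r-to-z statistic: arctanh(r) * sqrt(n - 3). Under the null,
# this is approximately standard normal, so z > 3 is a stringent cut.
z = np.arctanh(r) * np.sqrt(n - 3)

# A functional connection is drawn only when z exceeds the threshold.
connected = z > 3
```

With strongly correlated series like these, z comfortably exceeds 3; weakly correlated regions would fall below the cutoff and draw no line.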

