Supramodal representations of perceived emotions in the human brain

Marius V Peelen et al.

J Neurosci. 2010 Jul 28;30(30):10127-34. doi: 10.1523/JNEUROSCI.2161-10.2010.

Abstract

Basic emotional states (such as anger, fear, and joy) can be similarly conveyed by the face, the body, and the voice. Are there human brain regions that represent these emotional mental states regardless of the sensory cues from which they are perceived? To address this question, in the present study participants evaluated the intensity of emotions perceived from face movements, body movements, or vocal intonations, while their brain activity was measured with functional magnetic resonance imaging (fMRI). Using multivoxel pattern analysis, we compared the similarity of response patterns across modalities to test for brain regions in which emotion-specific patterns in one modality (e.g., faces) could predict emotion-specific patterns in another modality (e.g., bodies). A whole-brain searchlight analysis revealed modality-independent but emotion category-specific activity patterns in medial prefrontal cortex (MPFC) and left superior temporal sulcus (STS). Multivoxel patterns in these regions contained information about the category of the perceived emotions (anger, disgust, fear, happiness, sadness) across all modality comparisons (face-body, face-voice, body-voice), and independently of the perceived intensity of the emotions. No systematic emotion-related differences were observed in the overall amplitude of activation in MPFC or STS. These results reveal supramodal representations of emotions in high-level brain areas previously implicated in affective processing, mental state attribution, and theory-of-mind. We suggest that MPFC and STS represent perceived emotions at an abstract, modality-independent level, and thus play a key role in the understanding and categorization of others' emotional mental states.
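
The cross-modal pattern analysis described above can be illustrated with a minimal Python sketch: emotion-specific response patterns estimated from one modality are correlated with patterns from another modality, and within-emotion correlations (e.g., face-anger vs. body-anger) are compared against between-emotion correlations. The array shapes, the simulated data, and the helper name cross_modal_similarity are illustrative assumptions, not the authors' actual pipeline.

# Minimal sketch of a cross-modal correlation MVPA, assuming per-emotion
# pattern vectors (e.g., beta estimates within one searchlight sphere).
# Toy data and names are illustrative, not the study's pipeline.
import numpy as np

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness"]

def cross_modal_similarity(patterns_a, patterns_b):
    """Correlate emotion-specific patterns across two modalities.

    patterns_a, patterns_b: arrays of shape (n_emotions, n_voxels).
    Returns mean within-emotion and mean between-emotion Pearson r.
    """
    n = patterns_a.shape[0]
    # Top-right block of the stacked correlation matrix holds the
    # n_emotions x n_emotions cross-modality correlations.
    r = np.corrcoef(patterns_a, patterns_b)[:n, n:]
    within = np.mean(np.diag(r))                  # same emotion across modalities
    between = np.mean(r[~np.eye(n, dtype=bool)])  # different emotions
    return within, between

# Toy example: 5 emotions x 100 voxels, with an emotion-specific signal
# shared across modalities plus modality-specific noise.
rng = np.random.default_rng(0)
shared = rng.normal(size=(5, 100))
faces = shared + rng.normal(scale=2.0, size=(5, 100))
bodies = shared + rng.normal(scale=2.0, size=(5, 100))
w, b = cross_modal_similarity(faces, bodies)
print(f"within-emotion r = {w:.2f}, between-emotion r = {b:.2f}")

A searchlight analysis repeats this computation in a small sphere centered on every voxel in the brain, mapping where within-emotion correlations reliably exceed between-emotion correlations.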


Figures

Figure 1.
Schematic overview of the design. Each run consisted of 3 blocks of 12 trials each. Blocks differed in the type of stimuli presented (bodies, faces, or voices). On each trial, one of five different emotions could be expressed in a given modality. These trials were presented in random order within each block, whereas the order of blocks was counterbalanced across runs. Each trial consisted of a 2 s fixation cross, followed by a 3 s stimulus (movie or sound clip), a 1 s blank screen, and then a 2.5 s response window. After the presentation of each stimulus, participants were asked to rate the intensity of the perceived emotion on a 3-point scale. Not drawn to scale.
Figure 2.
Average intensity ratings for each experimental condition. ang, Anger; dis, disgust; fea, fear; hap, happiness; sad, sadness.
Figure 3.
Results of a whole-brain searchlight analysis showing clusters with significant emotion-specific activity patterns across modalities. Similarity of activity patterns was expressed as a correlation value, with higher correlations indicating greater similarity. Patterns were generally more similar within emotion categories (green bars) than between emotion categories (blue bars). Top row: cluster in MPFC; peak (x, y, z): 11, 48, 17; t(17) = 6.0, p = 0.00001; average Fisher-transformed correlation values in the MPFC cluster for the three cross-modality comparisons. Correlations were higher for within- than between-emotion comparisons for all three modality combinations (p < 0.05 for all tests). Bottom row: cluster in left STS; peak (x, y, z): −47, −62, 8; t(17) = 6.4, p < 0.00001; average Fisher-transformed correlation values in the STS cluster for the three cross-modality comparisons. Correlations were higher for within- than between-emotion comparisons for all three modality combinations (p < 0.05 for all tests). ang, Anger; dis, disgust; fea, fear; hap, happiness; sad, sadness; all, average across the five emotions. Error bars indicate within-subject SEM.
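
The within- versus between-emotion comparison reported in this caption can be sketched in a few lines: per-subject correlations are Fisher-transformed (r-to-z) and then compared with a paired t-test across subjects. The subject-level values below are simulated; only the group size (18 subjects, matching the df of t(17)) is taken from the caption.

# Hedged sketch of the group-level comparison: Fisher r-to-z transform
# followed by a paired t-test. Subject-level correlations are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_subjects = 18                                # matches df = 17 above
within_r = rng.normal(0.10, 0.08, n_subjects)  # hypothetical within-emotion r
between_r = rng.normal(0.02, 0.08, n_subjects) # hypothetical between-emotion r

within_z = np.arctanh(within_r)                # Fisher r-to-z transform
between_z = np.arctanh(between_r)

t, p = stats.ttest_rel(within_z, between_z)
print(f"t({n_subjects - 1}) = {t:.2f}, p = {p:.4f}")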
Figure 4.
Mean activation (parameter estimates) in MPFC (left) and STS (right) searchlight clusters for each experimental condition. Abbreviations are as in Figure 2.


