Review
Front Psychol. 2020 Mar 3;11:309. doi: 10.3389/fpsyg.2020.00309. eCollection 2020.

Expanding Simulation Models of Emotional Understanding: The Case for Different Modalities, Body-State Simulation Prominence, and Developmental Trajectories


Paddy Ross et al. Front Psychol. 2020.

Abstract

Recent models of emotion recognition suggest that when people perceive an emotional expression, they partially activate the respective emotion in themselves, providing a basis for the recognition of that emotion. Much of the focus of these models and of their evidential basis has been on sensorimotor simulation as a basis for facial expression recognition - the idea, in short, that coming to know what another feels involves simulating in your brain the motor plans and associated sensory representations engaged by the other person's brain in producing the facial expression that you see. In this review article, we argue that simulation accounts of emotion recognition would benefit from three key extensions. First, that fuller consideration be given to simulation of bodily and vocal expressions, given that the body and voice are also important expressive channels for providing cues to another's emotional state. Second, that simulation of other aspects of the perceived emotional state, such as changes in the autonomic nervous system and viscera, might have a more prominent role in underpinning emotion recognition than is typically proposed. Sensorimotor simulation models tend to relegate such body-state simulation to a subsidiary role, despite the plausibility of body-state simulation being able to underpin emotion recognition in the absence of typical sensorimotor simulation. Third, that simulation models of emotion recognition be extended to address how embodied processes and emotion recognition abilities develop through the lifespan. It is not currently clear how this system of sensorimotor and body-state simulation develops and in particular how this affects the development of emotion recognition ability. We review recent findings from the emotional body recognition literature and integrate recent evidence regarding the development of mimicry and interoception to significantly expand simulation models of emotion recognition.

Keywords: body-state simulation; development; emotion recognition; interoception; sensorimotor simulation.


Figures

FIGURE 1
Simulation model of emotion recognition from the face. Processing of a particular facial expression triggers other components of the emotion system, leading ultimately to emotional understanding. It should be noted that the arrows in this model do not imply neural modularity or specific sequential events; rather, they emphasize the distributed and recursive nature of the emotion perception process. This is an amended version of a figure first published in Wood et al. (2016b). Copyright (2016), reprinted with permission from Elsevier (License No. 4626470808358).
FIGURE 2
Distribution of lesion overlaps for emotion recognition tasks in studies by Adolphs and colleagues. (A) Distribution of lesion overlaps from all 108 subjects in the Adolphs et al. (2000) study as a function of mean recognition performance on an emotion-rating task for static facial expressions. Red regions correspond to locations at which lesions resulted in impairment on the task more often than not, and blue regions correspond to locations at which lesions resulted in normal performance more often than not. (B) Distribution of lesion overlaps from all 108 subjects in the Adolphs et al. (2000) study as a function of mean performance on tasks requiring either choosing the name of the facially expressed emotion or sorting the expressions into categories without requiring naming. Color coding as for (A). (C) Distribution of lesion overlaps from the Adolphs et al. (2002) study as a function of mean performance on an emotion-rating task for prosodic expressions, for the 33 individuals with the most abnormal ratings (red) compared with the 33 with the most normal ratings (blue). (D) Distribution of lesion overlaps from the Heberlein et al. (2004) study for the subjects who were impaired at emotion recognition from the point-light walker stimuli (>2 SD below the normal control mean). Figures (A,B): Copyright 2000 Society for Neuroscience. Figure (C): Copyright 2002 American Psychological Association. Figure (D): Copyright 2004 MIT Press.
FIGURE 3
Somatosensory brain regions implicated in emotion judgments. (A) Renderings on a slightly inflated standard brain of the post-central gyrus (orange), parietal operculum (red, corresponding approximately to S2/OP1), and insula (chartreuse yellow) in the right hemisphere, as delineated using the Harvard-Oxford Atlas (Desikan et al., 2006). (B) The same image as in (A) but rotated slightly to reveal more of S2/OP1 and the insula. (C) The same anatomical regions rendered on a non-inflated standard brain image, to show how the insula and S2 in particular are largely hidden, located away from the outermost surface of the brain (and thus the skull). (D) Primary somatosensory cortex (S1: orange, green, yellow) and secondary somatosensory cortex (S2/OP1: red) as delineated using the probabilistic atlas from the Jülich SPM Anatomy Toolbox (Eickhoff et al., 2005). (E) Plotted mean coordinates, from the studies discussed in the text, of target locations for transcranial magnetic stimulation (TMS) to disrupt emotion recognition. Blue dots indicate studies that used facial expressions, red dots indicate studies that used vocal expressions, and the green dot indicates a study that used (point-light) bodily expressions. The numbers next to the dots refer to the relevant studies, as noted in the table below; ‘+’ indicates an effect of TMS on emotion perception task performance; ‘−’ indicates no effect. (F) Plotted mean coordinates, from the studies discussed in the text, of the fMRI activation peaks in somatosensory cortices for explicit emotion judgments compared to incidental emotion processing. The blue dot indicates a study that used facial expressions, green dots indicate studies that used bodily expressions, and the pink dot indicates a study that used both bodily and facial expressions [point-light displays (PLDs)]. The numbers above the dots refer to the relevant studies, as noted in the table below.
(G) A group statistical non-parametric map (SnPM) for emotion judgments > color judgments on point-light body and face stimuli; unpublished data from Atkinson et al. (2012). The SnPM is thresholded at q < 0.05, FDR-corrected (≥10 contiguous voxels). The bottom row in (F) is the same as the top row except for a slight rotation to reveal more of the activations in bilateral SMG/parietal operculum (including S2). For all the images in this figure, the anatomical regions, coordinate markers and fMRI activations were mapped on to a partially inflated ICBM152 standard brain in MNI space using the BrainNet Viewer software (Xia et al., 2013).


References

    1. Addabbo M., Vacaru S. V., Meyer M., Hunnius S. (2019). Something in the way you move: infants are sensitive to emotions conveyed in action kinematics. Dev. Sci. 23:e12873. doi: 10.1111/desc.12873
    2. Adolphs R. (2002). Recognizing emotion from facial expressions: psychological and neurological mechanisms. Behav. Cogn. Neurosci. Rev. 1, 21–62. doi: 10.1177/1534582302001001003
    3. Adolphs R., Damasio H., Tranel D. (2002). Neural systems for recognition of emotional prosody: a 3-D lesion study. Emotion 2, 23–51.
    4. Adolphs R., Damasio H., Tranel D., Cooper G., Damasio A. R. (2000). A role for somatosensory cortices in the visual recognition of emotion as revealed by three-dimensional lesion mapping. J. Neurosci. 20, 2683–2690. doi: 10.1523/JNEUROSCI.20-07-02683.2000
    5. Adolphs R., Tranel D., Damasio A. R. (2003). Dissociable neural systems for recognizing emotions. Brain Cogn. 52, 61–69. doi: 10.1016/S0278-2626(03)00009-5