Cortex. 2023 Dec;169:35-49.
doi: 10.1016/j.cortex.2023.08.014. Epub 2023 Sep 30.

Inability to move one's face dampens facial expression perception


Shruti Japee et al. Cortex. 2023 Dec.

Abstract

Humans rely heavily on facial expressions for social communication to convey their thoughts and emotions and to understand them in others. One prominent but controversial view is that humans learn to recognize the significance of facial expressions by mimicking the expressions of others. This view predicts that an inability to make facial expressions (e.g., facial paralysis) would result in reduced perceptual sensitivity to others' facial expressions. To test this hypothesis, we developed a diverse battery of sensitive emotion recognition tasks to characterize expression perception in individuals with Moebius Syndrome (MBS), a congenital neurological disorder that causes facial palsy. Using computer-based detection tasks, we systematically assessed expression perception thresholds for static and dynamic face and body expressions. We found that while MBS individuals were able to perform challenging perceptual control tasks and body expression tasks, they were less efficient at extracting emotion from facial expressions compared to matched controls. Exploratory analyses of fMRI data from a small group of MBS participants suggested potentially reduced engagement of the amygdala in MBS participants during expression processing relative to matched controls. Collectively, these results suggest a role for facial mimicry and consequent facial feedback and motor experience in the perception of others' facial expressions.

Trial registration: ClinicalTrials.gov NCT02055248 NCT00001360.

Keywords: Emotion perception; Facial experience; Facial expressions; Facial feedback; Facial mimicry.


Conflict of interest statement

Declaration of competing interest The authors declare no competing interests.

Figures

Figure 1. Static Facial Expression Task.
A. Illustration of how stimuli were created by morphing a neutral face to its corresponding happy or fearful face in increments of 5%, yielding 21 images for each emotion from 100% neutral to 100% happy or fearful. B. Depiction of the trial structure – each trial began with a 250ms fixation cross, followed by the happy or fearful morph image (in separate runs) for 350ms, and a 1.5s response window during which participants were to indicate if they thought the face was happy or neutral (or fearful or neutral in separate runs) during emotion task runs, or whether the mouth was open or closed during control task runs. C. Box and whisker plots showing the thresholds for each task for control participants in blue and MBS individuals in red. MBS individuals had similar thresholds as control participants on the control task but higher thresholds for the emotion task (*p < 0.01; ***p < 10⁻⁶; ns: no significant difference). KDEF images used in Figure 1 panels A and B are reproduced from the KDEF stimulus database – Lundqvist et al., 1998 (https://www.kdef.se/home/aboutKDEF.html), with permission from Karolinska Institutet, Psychology section (copyright year: 1998; copyright holder: Karolinska Institutet, Psychology section).
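The morph continuum in panel A follows simple arithmetic: stepping from 0% to 100% emotional in 5% increments gives 21 levels per emotion. A minimal sketch of that sampling, with a pixel-wise linear blend as a simplified stand-in for the landmark-based morphing typically used for face stimuli (function names here are illustrative, not from the paper):

```python
def morph_levels(step=5):
    """Morph percentages from 100% neutral (0) to 100% emotional (100)."""
    return list(range(0, 101, step))

def blend(neutral, emotional, pct):
    """Pixel-wise linear morph: pct% emotional, (100 - pct)% neutral.
    (A simplified stand-in for landmark-based face morphing.)"""
    return (1 - pct / 100) * neutral + (pct / 100) * emotional

levels = morph_levels()  # [0, 5, 10, ..., 100]: 21 images per emotion
```

Because the endpoints 0 and 100 are both included, the count is 100/5 + 1 = 21, matching the caption.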
Figure 2. Identity and Expression Matching Task.
A. Images of 8 different facial identities and 4 different expressions (happy, fearful, angry, and neutral) from 3 different viewpoints (left, front, and right) were used as stimuli. B. Depiction of the trial structure – each trial began with a 100ms fixation cross, followed by the presentation of three faces in a triangle format for 2s, during which participants were to indicate which of the two bottom faces (or neither) matched the top face either on identity or expression (in separate runs). C. Box and whisker plots showing performance accuracy for identity and expression matching for control participants in blue and MBS individuals in red. MBS individuals had similar performance as control participants on the identity matching task but worse performance on the expression matching task (*p < 0.01; ns: no significant difference; chance performance on this task = 33%). KDEF images used in Figure 2 panels A and B are reproduced from KDEF stimulus database - Lundqvist et al., 1998 (https://www.kdef.se/home/aboutKDEF.html), with permission from Karolinska Institutet, Psychology section, Copyright year: 1998, Copyright holder: Karolinska Institutet, Psychology section.
Figure 3. Dynamic Facial Expression Task.
A. Videos of 8 different durations ranging from 200ms to 1.6s depicting dynamic facial expressions of happiness, fear, and anger were used as stimuli. B. Depiction of the trial structure – each trial began with a 500ms fixation cross, followed by a happy, fearful, or angry video of varying duration (200ms, 400ms, 600ms, 800ms, 1s, 1.2s, 1.4s, or 1.6s), and a 1s response window during which participants were to indicate if they thought the video depicted a happy, fearful, angry, or neutral expression during the emotion task runs, or whether the mouth moved or not during the facial motion control task runs. C. Box and whisker plots showing the thresholds for each task for control participants in blue and MBS individuals in red. MBS individuals had higher thresholds than control participants for both facial motion detection and emotion categorization tasks (**p < 0.001; ***p < 10⁻⁶). KDEF images used in Figure 3 panels A and B are reproduced from the KDEF stimulus database – Lundqvist et al., 1998 (https://www.kdef.se/home/aboutKDEF.html), with permission from Karolinska Institutet, Psychology section (copyright year: 1998; copyright holder: Karolinska Institutet, Psychology section).
Figure 4. Dynamic Body Expression Task.
A. Videos of 12 different durations ranging from 200ms to 2.4s depicting dynamic body expressions of happiness, fear, and anger were used as stimuli. B. Depiction of the trial structure – each trial began with a 500ms fixation cross, followed by a happy, fearful, or angry video of varying duration (200ms, 400ms, 600ms, 800ms, 1s, 1.2s, 1.4s, 1.6s, 1.8s, 2.0s, 2.2s, or 2.4s), and a 1s response window during which participants were to indicate if they thought the video depicted a happy, fearful, angry, or neutral expression during the emotion task runs, or whether the arms moved or not during the body motion control task runs. C. Graph showing the emotion thresholds for each task for control participants in blue and MBS individuals in red. MBS individuals had similar thresholds as control participants for body motion detection (control task) and for two of the three body emotion detection tasks (*p < 0.01; ns: no significant difference). Figure 4 panels A and B use reproduced images from the Action Database – Keefe et al., 2014, with permission from the authors (copyright year: 2014, copyright holder: Keefe et al.).
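Both dynamic tasks (Figures 3 and 4) sample video durations in fixed 200 ms steps – 8 durations up to 1.6 s for faces, 12 durations up to 2.4 s for bodies. A quick sketch of that sampling (the function name is hypothetical):

```python
def durations_ms(n_steps, step_ms=200):
    """Video durations in ms: step, 2*step, ..., n_steps*step."""
    return [step_ms * i for i in range(1, n_steps + 1)]

face_durations = durations_ms(8)    # Figure 3: 200 ms up to 1600 ms
body_durations = durations_ms(12)   # Figure 4: 200 ms up to 2400 ms
```

Thresholds in panel C are then the shortest durations at which participants reliably detected motion or categorized the emotion.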
Figure 5. Neural correlates of expression processing in a small group of MBS individuals relative to controls.
A. fMRI activity plots showing no difference in BOLD percent signal change in right FFA during identity (solid) vs. expression (patterned) matching for Controls (N = 15) in blue and MBS individuals (N = 6) in red. B. fMRI activity plots showing significantly higher BOLD percent signal change in right pSTS for expression compared to identity matching for Controls (*p < 0.01) but not MBS (p = 0.06). C. BOLD percent signal change in right amygdala was not significantly different during expression matching relative to identity matching for Controls or MBS, although numerically the two groups showed opposite patterns (Controls: expression > identity while MBS: expression < identity).
