Review

Emotional Expressions Reconsidered: Challenges to Inferring Emotion From Human Facial Movements

Lisa Feldman Barrett et al. Psychol Sci Public Interest. 2019 Jul;20(1):1-68. doi: 10.1177/1529100619832930.

Abstract

It is commonly assumed that a person's emotional state can be readily inferred from his or her facial movements, typically called emotional expressions or facial expressions. This assumption influences legal judgments, policy decisions, national security protocols, and educational practices; guides the diagnosis and treatment of psychiatric illness, as well as the development of commercial applications; and pervades everyday social interactions as well as research in other scientific fields such as artificial intelligence, neuroscience, and computer vision. In this article, we survey examples of this widespread assumption, which we refer to as the common view, and we then examine the scientific evidence that tests this view, focusing on the six most popular emotion categories used by consumers of emotion research: anger, disgust, fear, happiness, sadness, and surprise. The available scientific evidence suggests that people do sometimes smile when happy, frown when sad, scowl when angry, and so on, as proposed by the common view, more often than would be expected by chance. Yet how people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation. Furthermore, similar configurations of facial movements variably express instances of more than one emotion category. In fact, a given configuration of facial movements, such as a scowl, often communicates something other than an emotional state. Scientists agree that facial movements convey a range of information and are important for social communication, emotional or otherwise. But our review suggests an urgent need for research that examines how people actually move their faces to express emotions and other social information in the variety of contexts that make up everyday life, as well as careful study of the mechanisms by which people perceive instances of emotion in one another. We make specific research recommendations that will yield a more valid picture of how people move their faces to express emotions and how they infer emotional meaning from facial movements in situations of everyday life. This research is crucial to provide consumers of emotion research with the translational information they require.

Keywords: emotion perception; emotion recognition; emotional expression.


Figures

Figure 1. Explanatory frameworks guiding the science of emotion: the nature of emotion categories and their concepts.
The figure is organized along two dimensions. The horizontal axis represents hypotheses about the surface similarities shared by instances of the same emotion category (e.g., the facial movements that express those instances). The vertical axis represents hypotheses about the deep similarities in the mechanisms that cause instances of the same emotion category (i.e., the extent to which instances of the same category share deep, causal features). Colors represent the type of emotion categories proposed in each theoretical framework (green = ad hoc, abstract categories; yellow = prototype or theory-based categories; red = natural kind categories).
Figure 2. Example figures from recently published papers that reinforce the common belief in diagnostic facial expressions of emotion.
(A) Adapted from Cordaro et al. (in press), Table 1, with permission. Face photos © Dr. Lenny Kristal. (B) Adapted from Shariff & Tracy (2011), Figure 2, with permission.
Figure 3. Evaluation criteria: reliability and specificity in relation to forward and reverse inference.
Anger and fear are used as the example categories.
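The distinction Figure 3 draws between reliability (forward inference: do instances of anger reliably produce a scowl?) and specificity (reverse inference: does a scowl indicate anger rather than something else?) can be made concrete with a small computation. The sketch below uses hypothetical co-occurrence counts, not data from the paper, and one simple operationalization of the two criteria:

```python
# Hypothetical co-occurrence counts (illustrative only; not data from the paper).
# Keys index the person's actual state; values count whether a scowl was observed.
counts = {
    "anger":     {"scowl": 30, "no_scowl": 70},   # people often do not scowl when angry
    "not_anger": {"scowl": 45, "no_scowl": 855},  # scowls also occur without anger
}

# Forward inference (reliability): P(scowl | anger).
anger = counts["anger"]
reliability = anger["scowl"] / (anger["scowl"] + anger["no_scowl"])

# Reverse inference (specificity): P(anger | scowl).
total_scowls = anger["scowl"] + counts["not_anger"]["scowl"]
specificity = anger["scowl"] / total_scowls

print(f"reliability P(scowl|anger) = {reliability:.2f}")  # 0.30
print(f"specificity P(anger|scowl) = {specificity:.2f}")  # 0.40
```

With these made-up counts, both values fall well short of what the common view would predict: scowling is neither a reliable nor a specific signal of anger, which is exactly the pattern of shortfall the figure's criteria are designed to expose.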
Figure 4. Facial action ensembles for commonsense facial configurations.
Facial Action Coding System (FACS) codes that correspond to the commonsense expressive configurations in adults. (A) The proposed expression for anger corresponds to the prescribed EMFACS code for anger (AUs 4, 5, 7, and 23). (B) The proposed expression for disgust corresponds to the prescribed EMFACS code for disgust (AU 10). (C) The proposed expression for fear corresponds to the prescribed EMFACS code for fear (AUs 1, 2, and 5, or 5 and 20). (D) The proposed expression for happiness corresponds to the prescribed EMFACS code for the so-called Duchenne smile (AUs 6 and 12). (E) The proposed expression for sadness corresponds to the prescribed EMFACS code for sadness (AUs 1, 4, 11, and 15, or 1, 4, 15, and 17). (F) The proposed expression for surprise corresponds to the prescribed EMFACS code for surprise (AUs 1, 2, 5, and 26). It was originally proposed that infants express emotions with the same facial configurations as adults, but later research revealed morphological differences between the proposed expressive configurations for adults and infants: only three of a possible nineteen proposed configurations for negative emotions from the infant coding scheme matched the configurations proposed for adults (Oster et al., 1992). (G) Adapted from Cordaro et al. (in press), Table 1, with permission. Face photos © Dr. Lenny Kristal. (H) Adapted from Shariff & Tracy (2011), Figure 2, with permission.
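For readers who work with FACS codes programmatically, the AU ensembles listed in this caption can be transcribed directly into a lookup table. The AU lists below come from the caption; the dictionary structure, names, and the simple subset-matching rule are our own illustrative choices, not part of EMFACS itself:

```python
# EMFACS action-unit (AU) ensembles for the commonsense configurations,
# transcribed from the Figure 4 caption. Categories with alternative
# codings (fear, sadness) list each variant separately.
EMFACS_CONFIGURATIONS = {
    "anger":     [{4, 5, 7, 23}],
    "disgust":   [{10}],
    "fear":      [{1, 2, 5}, {5, 20}],
    "happiness": [{6, 12}],  # the so-called Duchenne smile
    "sadness":   [{1, 4, 11, 15}, {1, 4, 15, 17}],
    "surprise":  [{1, 2, 5, 26}],
}

def matches_configuration(observed_aus, category):
    """Return True if the observed AUs contain any prescribed variant
    for the given category (one simple matching rule among many)."""
    observed = set(observed_aus)
    return any(variant <= observed for variant in EMFACS_CONFIGURATIONS[category])

# Example: AUs 4, 5, 7, and 23 were coded on a face.
print(matches_configuration([4, 5, 7, 23], "anger"))  # True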
Figure 5. Meta-analysis of facial movements during emotional episodes: a summary of effect sizes across studies (Duran et al., 2017).
Effect sizes are computed as correlations or proportions (as reported in the original experiments). Results include experiments that reported a correspondence between a facial configuration and its hypothesized emotion category, as well as those that reported a correspondence between individual AUs of that configuration and the relevant emotion category; meta-analytic summaries restricted to entire ensembles of AUs (the facial configurations specified in Figure 2) were even lower than those that appear here.
Figure 6. Comparing posed and spontaneous facial movements.
Posed results are from Table 6 of Cordaro et al. (2017), showing the degree of overlap between the hypothesized configuration of facial movements for each emotion category and the "International Core Patterns" derived from participants' expressive poses. For the Gabonese participants in Elfenbein et al. (2007), reliability for the anger category is for AU4 + AU5 only. Only proportion data from Duran et al. (2017) are included.
Figure 7. Examples of virtual humans.
Virtual humans are software-based artifacts that look and act like people. (A) Feng et al., 2017; (B) Zoll et al., 2006; (C) Hoyt et al., 2003; (D) Marsella et al., 2000.
Figure 8. Emotion perception findings.
(A) Average effect sizes for perceptions of facial configurations from Elfenbein & Ambady (2002), in which 95% of the articles summarized used choice-from-array to measure participants' emotion inferences. (B) Free labeling of facial configurations across five language groups from Srinivasan & Martinez (2018). The IDs chosen represent the best match to the commonsense facial configurations in Figure 4, based on the AUs present. No configuration discovered in this study exactly matched the AU configurations proposed by Darwin or documented in prior research. The proportion of times participants offered emotion category labels (or their synonyms) is reported. According to standard scientific criteria, universal expressions of emotion should elicit agreement rates considerably higher than those reported here, generally in the 70-90% range, even when methodological constraints are relaxed (Haidt & Keltner, 1999). Specificity data were not available for the Elfenbein & Ambady (2002) meta-analysis.
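To see what the free-labeling proportions in panel B measure, consider a small sketch that scores freely offered labels against a target category, counting synonyms as matches. The responses and synonym set below are invented for illustration and are not data from Srinivasan & Martinez (2018):

```python
# Illustrative only: score free-labeling responses against a target emotion
# category, counting synonyms as matches (responses and synonyms invented).
SYNONYMS = {
    "anger": {"anger", "angry", "mad", "furious", "irritated"},
}

def agreement_rate(responses, category):
    """Proportion of responses naming the category or one of its synonyms."""
    matches = sum(r.strip().lower() in SYNONYMS[category] for r in responses)
    return matches / len(responses)

responses = ["angry", "confused", "mad", "thinking hard", "furious", "tired"]
print(f"{agreement_rate(responses, 'anger'):.2f}")  # 0.50, below the 70-90% criterion
```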
Figure 9. Map of cross-cultural studies of emotion perception in small-scale societies.
People in small-scale societies typically live in groupings of several hundred to several thousand that maintain autonomy in social, political, and economic spheres. (A) Epoch 1 studies, published between 1969 and 1975, were geographically constrained to societies in the South Pacific. (B) Epoch 2 studies, published between 2008 and 2017, sample from a broader geographic range, including Africa and South America, and are more diverse in the ecological and social contexts of the societies tested. This type of diversity is a necessary condition for discovering the extent of cultural variation in psychological phenomena (Medin et al., 2017). Reproduced with permission from Gendron et al. (2018).

