Proc Natl Acad Sci U S A. 2022 Apr 26;119(17):e2115228119. doi: 10.1073/pnas.2115228119. Epub 2022 Apr 21.

Deep models of superficial face judgments

Joshua C Peterson et al.

Abstract

The diversity of human faces and the contexts in which they appear give rise to an expansive stimulus space over which people infer psychological traits (e.g., trustworthiness or alertness) and other attributes (e.g., age or adiposity). Machine learning methods, in particular deep neural networks, provide expressive feature representations of face stimuli, but the correspondence between these representations and various human attribute inferences is difficult to determine because the former are high-dimensional vectors produced via black-box optimization algorithms. Here we combine deep generative image models with over 1 million judgments to model inferences of more than 30 attributes over a comprehensive latent face space. The predictive accuracy of our model approaches human interrater reliability, which simulations suggest would not have been possible with fewer faces, fewer judgments, or lower-dimensional feature representations. Our model can be used to predict and manipulate inferences with respect to arbitrary face photographs or to generate synthetic photorealistic face stimuli that evoke impressions tuned along the modeled attributes.
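
The core modeling idea, mapping a generative model's latent face codes to averaged human ratings, can be illustrated with a minimal sketch. This is not the authors' code: the latent codes and ratings below are random placeholders, and a cross-validated ridge regression stands in for whatever regression the paper actually used.

    import numpy as np
    from sklearn.linear_model import RidgeCV
    from sklearn.model_selection import cross_val_score

    # Placeholder data standing in for real latent codes and mean ratings.
    rng = np.random.default_rng(0)
    latents = rng.normal(size=(1000, 512))        # one latent vector per face
    weights = rng.normal(size=512)
    ratings = latents @ weights * 0.1 + rng.normal(scale=0.5, size=1000)

    # Cross-validated linear mapping from latent features to mean ratings.
    model = RidgeCV(alphas=np.logspace(-3, 3, 13))
    scores = cross_val_score(model, latents, ratings, cv=5, scoring="r2")
    print(f"cross-validated R^2: {scores.mean():.2f}")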

Keywords: computational models; face perception; social traits.

Conflict of interest statement

Competing interest statement: All authors are listed as inventors on a related patent (US Patent no. 11,250,245, “Data-driven, photorealistic social face-trait encoding, prediction, and manipulation using deep neural networks”).

Figures

Fig. 1. Correlation matrix for 34 average attribute ratings for each of 1,000 faces. Rows and columns are arranged according to a hierarchical clustering of the correlation values.
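
A layout like Fig. 1 could be produced as sketched below, assuming a matrix of mean ratings per face and attribute; the placeholder data and the clustering settings (average linkage on correlation distance) are my assumptions, not details taken from the paper.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, leaves_list
    from scipy.spatial.distance import squareform

    # Placeholder mean ratings: 1,000 faces x 34 attributes.
    rng = np.random.default_rng(1)
    mean_ratings = rng.normal(size=(1000, 34))

    # Correlate attributes across faces, then reorder by hierarchical clustering.
    corr = np.corrcoef(mean_ratings, rowvar=False)      # 34 x 34
    dist = squareform(1.0 - corr, checks=False)         # correlation distance
    order = leaves_list(linkage(dist, method="average"))
    corr_ordered = corr[np.ix_(order, order)]           # matrix as plotted
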
Fig. 2. Average cross-validated model performance (black bars) compared to intersubject reliability (red markers).
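
The red markers in Fig. 2 represent a reliability ceiling for model performance. One common way to estimate such a ceiling, shown below under my own assumptions (the paper's exact estimator may differ), is a Spearman-Brown-corrected split-half correlation of rater means.

    import numpy as np

    # Placeholder per-rater ratings: 30 raters x 1,000 faces for one attribute.
    rng = np.random.default_rng(2)
    ratings = rng.normal(size=(30, 1000))

    # Split raters into halves, correlate half-sample means across faces,
    # then apply the Spearman-Brown correction.
    idx = rng.permutation(ratings.shape[0])
    half_a = ratings[idx[: len(idx) // 2]].mean(axis=0)
    half_b = ratings[idx[len(idx) // 2 :]].mean(axis=0)
    r = np.corrcoef(half_a, half_b)[0, 1]
    reliability = 2 * r / (1 + r)
    print(f"split-half reliability (Spearman-Brown): {reliability:.2f}")
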
Fig. 3. Model performance (R²) for each attribute as a function of the number of face examples (Top), the number of participant ratings for each face example (Middle), and the number of image feature dimensions (Bottom). Attributes are ordered by the maximum model performance observed in Top.
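
The flavor of the Bottom panel can be reproduced by refitting the regression with progressively fewer feature dimensions. The sketch below uses PCA truncation of placeholder latents; the paper's actual feature manipulation may differ.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import RidgeCV
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    # Placeholder latent codes and mean ratings.
    rng = np.random.default_rng(3)
    latents = rng.normal(size=(1000, 512))
    ratings = latents @ rng.normal(size=512) * 0.1 + rng.normal(scale=0.5, size=1000)

    # Vary the number of feature dimensions by truncating a PCA of the latents.
    for n_dims in (8, 32, 128, 512):
        model = make_pipeline(PCA(n_components=n_dims),
                              RidgeCV(alphas=np.logspace(-3, 3, 13)))
        r2 = cross_val_score(model, latents, ratings, cv=5, scoring="r2").mean()
        print(f"{n_dims:3d} dims: R^2 = {r2:.2f}")
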
Fig. 4. (A) The faces judged on average to have the highest and lowest ratings along six sample perceived attribute dimensions. (B) Model-based manipulations of two sample base faces along the sample dimensions, demonstrating smooth and effective manipulations along each attribute.
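
The manipulations in Fig. 4B amount to shifting a face's latent code along a learned attribute direction and re-synthesizing the image. The sketch below assumes a linear attribute direction (e.g., regression weights) and a hypothetical generator function; both names are placeholders rather than the paper's implementation.

    import numpy as np

    def manipulate(z_base, w_attr, strength):
        """Shift latent code z_base along the unit-normalized attribute direction."""
        direction = w_attr / np.linalg.norm(w_attr)
        return z_base + strength * direction

    # Hypothetical usage: sweep the strength to obtain a smooth series of edits.
    # images = [generator(manipulate(z_base, w_attr, s)) for s in (-2, -1, 0, 1, 2)]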
