Review

Affect Sci. 2023 Aug 25;4(3):550-562. doi: 10.1007/s42761-023-00215-z. eCollection 2023 Sep.

Advancing Naturalistic Affective Science with Deep Learning

Chujun Lin et al.

Abstract

People express their own emotions and perceive others' emotions via a variety of channels, including facial movements, body gestures, vocal prosody, and language. Studying these channels of affective behavior offers insight into both the experience and perception of emotion. Prior research has predominantly focused on studying individual channels of affective behavior in isolation using tightly controlled, non-naturalistic experiments. This approach limits our understanding of emotion in more naturalistic contexts where different channels of information tend to interact. Traditional methods struggle to address this limitation: manually annotating behavior is time-consuming, making it infeasible to do at large scale; manually selecting and manipulating stimuli based on hypotheses may neglect unanticipated features, potentially generating biased conclusions; and common linear modeling approaches cannot fully capture the complex, nonlinear, and interactive nature of real-life affective processes. In this methodology review, we describe how deep learning can be applied to address these challenges to advance a more naturalistic affective science. First, we describe current practices in affective research and explain why existing methods face challenges in revealing a more naturalistic understanding of emotion. Second, we introduce deep learning approaches and explain how they can be applied to tackle three main challenges: quantifying naturalistic behaviors, selecting and manipulating naturalistic stimuli, and modeling naturalistic affective processes. Finally, we describe the limitations of these deep learning methods, and how these limitations might be avoided or mitigated. By detailing the promise and the peril of deep learning, this review aims to pave the way for a more naturalistic affective science.

Keywords: Affective science; Cognitive modeling; Deep learning; Generalizability; Person perception.

Conflict of interest statement

On behalf of all authors, the corresponding author states that there is no conflict of interest.

Figures

Fig. 1 The structure and training process of a DNN. A. The basic components of a DNN. B. The computations performed inside a neuron. C. The training process for minimizing loss (prediction error) using stochastic gradient descent via backpropagation.
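To make the training recipe summarized in Fig. 1 concrete, the following is a minimal sketch (not the authors' model or code) of a small feedforward DNN trained by minimizing a loss with stochastic gradient descent via backpropagation in PyTorch; the toy data, layer sizes, and learning rate are illustrative assumptions only.

    # Minimal illustrative sketch, assuming PyTorch is available; not the paper's implementation.
    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # Hypothetical toy data: 256 samples with 10 features and binary labels.
    X = torch.randn(256, 10)
    y = (X.sum(dim=1) > 0).float().unsqueeze(1)

    # A. Basic components: an input layer, one hidden layer, and an output layer.
    model = nn.Sequential(
        nn.Linear(10, 16),   # B. Each neuron computes a weighted sum of its inputs plus a bias...
        nn.ReLU(),           # ...followed by a nonlinear activation.
        nn.Linear(16, 1),
        nn.Sigmoid(),
    )

    loss_fn = nn.BCELoss()                                   # loss = prediction error
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # stochastic gradient descent

    # C. Training loop: forward pass, compute loss, backpropagate gradients, update weights.
    for epoch in range(100):
        optimizer.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()      # backpropagation
        optimizer.step()     # gradient descent step

The same loop structure underlies the much larger networks discussed in the review; only the architecture, data, and loss change.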
Fig. 2 Applications of DNNs for advancing naturalistic affective research. A. DNNs provide a more scalable way to quantify the behavior of study participants and stimulus targets in naturalistic contexts. B. DNN-based quantifications can support better experimentation by facilitating naturalistic stimulus selection and manipulation. C. DNNs can capture interactive and nonlinear effects, making them well suited for modeling the cognitive/neural mechanisms underlying the subjective experience, physiological responses, and the recognition and expression of emotions.
Fig. 3 Improvements of DNNs for behavioral quantification over time. Title indicates the behavior channel and the corresponding benchmark dataset that the models were evaluated on. X-axis indicates the year the models were published. Y-axis indicates the metric for measuring model performance. Data reflect benchmarks reported on paperswithcode.com (Papers with Code, n.d.).
