Entropy (Basel). 2019 Jun 29;21(7):646. doi: 10.3390/e21070646.

Emotion Recognition from Skeletal Movements

Tomasz Sapiński et al.

Abstract

Automatic emotion recognition has become an important trend in many artificial intelligence (AI) based applications and has been widely explored in recent years. Most research in the area of automated emotion recognition is based on facial expressions or speech signals. Although the influence of the emotional state on body movements is undeniable, this source of expression is still underestimated in automatic analysis. In this paper, we propose a novel method to recognise seven basic emotional states, namely happiness, sadness, surprise, fear, anger, disgust and neutral, utilising body movement. We analyse motion capture data recorded from professional actors/actresses under the seven basic emotional states using a Microsoft Kinect v2 sensor. We propose a new representation of affective movements based on sequences of body joints: the algorithm creates a sequential model of affective movement from low-level features inferred from the spatial location and orientation of joints within the tracked skeleton. In the experiments, different deep neural networks were employed and compared to recognise the emotional state of the acquired motion sequences. The results show the feasibility of automatic emotion recognition from sequences of body gestures, which can serve as an additional source of information in multimodal emotion recognition.
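The abstract describes building a sequential model from low-level features of joint positions within the tracked skeleton. A minimal sketch of what such a feature extraction could look like is shown below; the joint count matches the Kinect v2 (25 joints), but the root-joint choice and the exact features (centred positions plus frame-to-frame displacements) are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

# Hypothetical sketch: a Kinect v2 skeleton frame has 25 joints, each with an
# (x, y, z) position. The paper's exact low-level features are not specified
# here; as an illustration we centre each frame on an assumed root joint and
# append per-joint displacements between consecutive frames.

N_JOINTS = 25          # Kinect v2 tracks 25 skeletal joints
ROOT = 0               # assumption: joint 0 is the skeleton's root (SpineBase)

def skeleton_sequence_to_features(frames: np.ndarray) -> np.ndarray:
    """frames: (T, 25, 3) array of joint positions over T time steps.
    Returns a (T-1, 25*6) feature matrix: centred positions plus
    frame-to-frame displacements for every joint."""
    centred = frames - frames[:, ROOT:ROOT + 1, :]     # remove global translation
    velocity = np.diff(centred, axis=0)                # per-joint displacement
    feats = np.concatenate([centred[1:], velocity], axis=-1)  # (T-1, 25, 6)
    return feats.reshape(feats.shape[0], -1)

# Example: a random 120-frame recording (~4 s at the Kinect's 30 fps)
seq = np.random.rand(120, N_JOINTS, 3)
X = skeleton_sequence_to_features(seq)
print(X.shape)  # (119, 150)
```

Each row of the resulting matrix is one time step of the sequential representation, ready to be fed to a sequence model such as the RNN variants compared in the paper.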

Keywords: Kinect sensor; body movements; deep learning; emotion recognition; gestures; neural networks.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1. The structure of the proposed emotional gesture expression recognition approach.
Figure 2. Selected frames of actors'/actresses' poses in six basic emotions: fear, surprise, anger, sadness, happiness, disgust.
Figure 3. (a) Skeleton mapping in relation to the human body [33]. (b) An example frame of a Kinect recording showing the skeleton.
Figure 4. Sequence of three key frames extracted from point cloud data representing happiness.
Figure 5. Heat map showing the distribution of joint involvement for each emotional state: (a) all joints; (b) excluding hands.
Figure 6. (a) The process of using a Convolutional Neural Network (CNN) for gesture-based emotion recognition: matrices are created from the motion sequence. (b) The process of using a Recurrent Neural Network (RNN) for motion sequence analysis: each time step of the motion sequence is evaluated by the RNN.
Figure 7. Confusion matrices for (a) CNN, (b) RNN and (c) RNN-LSTM on the P set with a 3 cm error rate. Seven emotional states: Ne—neutral, Sa—sadness, Su—surprise, Fe—fear, An—anger, Di—disgust, Ha—happiness.
Figure 8. Confusion matrices for (a) CNN, (b) RNN and (c) RNN-LSTM on the P set with a 3 cm error rate. Six emotional states: Fe—fear, Ha—happiness, Sa—sadness, Su—surprise, An—anger, Di—disgust.
Figure 9. Confusion matrices for (a) CNN, (b) RNN and (c) RNN-LSTM on the P set with a 3 cm error rate. Four emotional states: Fe—fear, Ha—happiness, Sa—sadness, An—anger.
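The RNN variant in Figure 6b evaluates the motion sequence one time step at a time before classifying into the emotional states listed in the confusion matrices. A minimal, framework-free sketch of that evaluation pattern is given below; the layer sizes, random weights and single-layer vanilla-RNN cell are illustrative assumptions, not the paper's trained CNN/RNN/RNN-LSTM models.

```python
import numpy as np

# Illustrative single-layer vanilla RNN: it consumes one feature vector per
# time step and emits a distribution over the seven emotional states.
# All dimensions and weights here are assumptions for the sketch.

EMOTIONS = ["neutral", "sadness", "surprise", "fear",
            "anger", "disgust", "happiness"]
D_IN, D_HID = 150, 64            # e.g. 25 joints x 6 features per frame

rng = np.random.default_rng(0)
W_xh = rng.normal(0, 0.1, (D_IN, D_HID))      # input-to-hidden weights
W_hh = rng.normal(0, 0.1, (D_HID, D_HID))     # hidden-to-hidden recurrence
W_hy = rng.normal(0, 0.1, (D_HID, len(EMOTIONS)))  # hidden-to-output weights

def rnn_classify(X: np.ndarray) -> np.ndarray:
    """X: (T, D_IN) feature sequence. Returns softmax scores over emotions."""
    h = np.zeros(D_HID)
    for x_t in X:                      # evaluate each time step in order
        h = np.tanh(x_t @ W_xh + h @ W_hh)
    logits = h @ W_hy                  # classify from the final hidden state
    e = np.exp(logits - logits.max()) # numerically stable softmax
    return e / e.sum()

probs = rnn_classify(rng.normal(size=(119, D_IN)))
print(probs.shape)  # (7,)
```

An LSTM cell (as in the paper's RNN-LSTM variant) would replace the `tanh` update with gated state updates, but the per-time-step evaluation loop is the same.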

References

    1. Ekman P. Facial Action Coding System (FACS). A Human Face. 2002. Available online: https://www.cs.cmu.edu/~face/facs.htm (accessed on 28 June 2019).
    2. Pease A., McIntosh J., Cullen P. Body Language. Malor Books; Los Altos, CA, USA: 1981.
    3. Izdebski K. Emotions in the Human Voice, Volume 3: Culture and Perception. Plural Publishing; San Diego, CA, USA: 2008.
    4. Kim J., André E. Emotion recognition based on physiological changes in music listening. IEEE Trans. Pattern Anal. Mach. Intell. 2008;30:2067–2083. doi: 10.1109/TPAMI.2008.26.
    5. Ekman P. Emotions Revealed: Understanding Faces and Feelings. Hachette; London, UK: 2012.
