Emotion recognition using Kinect motion capture data of human gaits

Shun Li et al. PeerJ. 2016 Sep 15;4:e2364. doi: 10.7717/peerj.2364. eCollection 2016.

Abstract

Automatic emotion recognition is of great value in many applications; however, to realize this value fully, more portable, non-intrusive, and inexpensive technologies need to be developed. Human gait can reflect the walker's emotional state and can therefore serve as an information source for emotion recognition. This paper proposes a novel method to recognize emotional states from human gaits using the Microsoft Kinect, a low-cost, portable, camera-based sensor. Fifty-nine participants' gaits under a neutral state, induced anger, and induced happiness were recorded by two Kinect cameras, and the raw data were processed through joint selection, coordinate system transformation, sliding-window Gaussian filtering, differential operation, and data segmentation. Gait features were extracted from the 3-dimensional coordinates of 14 main body joints by Fourier transformation and Principal Component Analysis (PCA). The classifiers NaiveBayes, RandomForests, LibSVM, and SMO (Sequential Minimal Optimization) were trained and evaluated, and the accuracies of recognizing anger and happiness against the neutral state reached 80.5% and 75.4%, respectively. Although the results of distinguishing the angry and happy states were not ideal in the current study, the work demonstrates the feasibility of automatically recognizing emotional states from gaits, with characteristics that meet the application requirements.
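To illustrate the processing pipeline summarized above, the following is a minimal Python sketch, assuming hypothetical placeholder data rather than the authors' recordings. It strings together sliding-window Gaussian filtering of the 14 joints' 3-D trajectories, frame-to-frame differencing, Fourier-transform features, PCA, and an SVM classifier; names such as gait_features, segments, and labels are illustrative and do not come from the paper.

import numpy as np
from scipy.ndimage import gaussian_filter1d
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def gait_features(joints_xyz, n_freq=10, sigma=2.0):
    # joints_xyz: array of shape (frames, 14 joints, 3 coordinates), as recorded by Kinect
    frames, n_joints, _ = joints_xyz.shape
    series = joints_xyz.reshape(frames, n_joints * 3)
    smoothed = gaussian_filter1d(series, sigma=sigma, axis=0)  # Gaussian smoothing over time
    velocity = np.diff(smoothed, axis=0)                       # differential operation
    spectrum = np.abs(np.fft.rfft(velocity, axis=0))           # Fourier transform per channel
    return spectrum[:n_freq].ravel()                           # low-frequency magnitudes as features

# Placeholder gait segments and induced-emotion labels (illustrative only).
rng = np.random.default_rng(0)
segments = [rng.standard_normal((120, 14, 3)) for _ in range(60)]
labels = rng.choice(["neutral", "anger"], size=60)

X = np.vstack([gait_features(s) for s in segments])
clf = make_pipeline(StandardScaler(), PCA(n_components=20), SVC(kernel="linear"))
print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())

This sketch only shows the general shape of such a pipeline; the paper's actual joint selection, coordinate transformation, segmentation, and classifier configurations (NaiveBayes, RandomForests, LibSVM, SMO) are described in its Methods section.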

Keywords: Affective computing; Emotion recognition; Gait; Kinect; Machine learning.

Conflict of interest statement

The authors declare there are no competing interests.

Figures

Figure 1. The experiment scene.
Figure 2. The schematic of the experiment environment.
Figure 3. The procedures of the first round experiment.
Figure 4. Stick figure and location of body joint centers recorded by Kinect.
