Review

Sensors (Basel). 2018 Jun 28;18(7):2074. doi: 10.3390/s18072074.

A Review of Emotion Recognition Using Physiological Signals

Lin Shu et al.

Abstract

Emotion recognition based on physiological signals has been a hot research topic and has been applied in many areas, such as safe driving, health care, and social security. In this paper, we present a comprehensive review of physiological signal-based emotion recognition, covering emotion models, emotion elicitation methods, published emotional physiological datasets, features, classifiers, and the overall framework for emotion recognition based on physiological signals. A summary and comparison of recent studies is provided, existing problems are identified, and directions for future work are discussed.

Keywords: classifiers; emotion model; emotion recognition; emotion stimulation; features; physiological signals.
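
As a rough sketch of the generic pipeline the abstract outlines (signal windowing, feature extraction, classification), the following Python example runs simple time-domain features from a placeholder physiological recording through an off-the-shelf SVM. The sampling rate, window length, feature set, and labels are illustrative assumptions, not details taken from the review or any surveyed study.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def window_features(signal, fs=128, win_s=4):
    # Split a 1-D signal into non-overlapping windows and compute simple time-domain features.
    win = fs * win_s
    n = len(signal) // win
    feats = []
    for i in range(n):
        w = signal[i * win:(i + 1) * win]
        feats.append([w.mean(), w.std(), np.abs(np.diff(w)).mean(), w.min(), w.max()])
    return np.array(feats)

rng = np.random.default_rng(0)
signal = rng.standard_normal(128 * 4 * 100)     # stand-in for 100 four-second windows of, e.g., GSR at 128 Hz
X = window_features(signal)                     # shape (100, 5): one feature vector per window
y = rng.integers(0, 2, size=len(X))             # placeholder binary arousal labels

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print(cross_val_score(clf, X, y, cv=5).mean())  # chance-level here by construction, since the data are random

In practice the window features would be replaced by the physiological features surveyed in the review (e.g., HRV statistics or EEG band powers), and the labels would come from the elicitation protocol.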


Conflict of interest statement

The authors declare no conflicts of interest.

Figures

Figure 1. Plutchik’s Wheel of Emotions.
Figure 2. 2D emotion space model.
Figure 3. 3D emotion space model.
Figure 4. (a) MAUI—Multimodal Affective User Interface; (b) Framework of AVRS; (c) VR scenes cut show.
Figure 5. Position of the bio-sensors.
Figure 6. (a) Mean accuracy of different channels; (b) Performance for different window sizes; (c) Average accuracies of GELM; (d) Spectrogram showing different patterns for different emotions.
Figure 7. (a) Logical scheme of the overall short-time emotion recognition concept; (b) Instantaneous tracking of the HRV indices computed from a representative subject using the proposed NARI model during passive emotional elicitation (two neutral sessions alternated with an L-M and an M-H arousal session); (c) Diagram of the proposed method; (d) Experimental results.
Figure 8. (a) The Emotion Check device; (b) Diagram of the components of the Emotion Check device; (c) Prototype of a glove with a sensor unit; (d) BodyMedia SenseWear Armband; (e) Left: physiological measures of EMG and EDA; middle: physiological measures of EEG, BVP and TMP; right: physiological measures from the physiological sensors in the experiments; (f) Illustration of R-TIPS, a platform that allows wireless monitoring of cardiac signals and consists of a transmitter system and three sensors; (g) The transmitter system is placed on the participant’s hip, and the sensors are placed below the right breast, on the right side, and on the back.
Figure 9. (a) Monitoring of epileptic seizures using EDA; (b,c) Wearable GSR sensor.
Figure 10. Emotion recognition process using physiological signals under target emotion stimulation.
Figure 11. (a) Decomposition of the R-R interval signal (emotion of sadness); (b) Structure of the autoencoder; (c) Structure of the Bimodal Deep AutoEncoder.
Figure 12. (a) Typical framework of multimodal information fusion; (b) SVM results for different emotions by EEG frequency band; (c) Demo of the proposed feature-level fusion; a feature vector created at any time step is valid for the next two steps.
Figure 13. Classification models.
Figure 14. (a) Structure of a standard RNN and an LSTM; (b) Structure and settings of the CNN.
Figure 15. ROC.
Figure 16. Comparative results on the same publicly accessible datasets: (a) DEAP; (b) MAHNOB database; (c) SEED.
Figure 17. Comparison of recognition rates among previous research.
Figure 18. Subject-dependent and subject-independent recognition rates. (The horizontal axis uses the same sequence numbers as Table 3. Colors indicate the number of classification categories: blue—two, yellow—three, green—four, grey—five, purple—six.)
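
To make the fusion and evaluation steps referenced in Figures 12 and 15 above concrete, the following hedged sketch concatenates features from two hypothetical modalities (feature-level fusion) and scores a classifier by the area under the ROC curve. All arrays, feature dimensions, and the classifier choice are placeholders, not a reproduction of any study compared in the review.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 200
eeg_feats = rng.standard_normal((n, 10))    # placeholder EEG band-power features
gsr_feats = rng.standard_normal((n, 3))     # placeholder skin-conductance statistics
y = rng.integers(0, 2, size=n)              # placeholder binary valence labels

# Feature-level fusion: concatenate per-modality features before classification.
X = np.hstack([eeg_feats, gsr_feats])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
scores = clf.predict_proba(X_te)[:, 1]      # class-1 probabilities used to rank test samples
print("AUC:", roc_auc_score(y_te, scores))  # area under the ROC curve (about 0.5 on random data)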
