Sheng Wu Yi Xue Gong Cheng Xue Za Zhi. 2025 Feb 25;42(1):32-41. doi: 10.7507/1001-5515.202408013.

[Dynamic continuous emotion recognition method based on electroencephalography and eye movement signals]

[Article in Chinese]

Yangmeng Zou et al.

Abstract

Existing emotion recognition research is typically limited to static laboratory settings and does not fully capture changes in emotional state in dynamic scenarios. To address this problem, this paper proposes a dynamic continuous emotion recognition method based on electroencephalography (EEG) and eye movement signals. First, an experimental paradigm was designed to cover six dynamic emotion transition scenarios: happy to calm, calm to happy, sad to calm, calm to sad, nervous to calm, and calm to nervous. EEG and eye movement data were collected simultaneously from 20 subjects to fill the gap in current multimodal dynamic continuous emotion datasets. In the valence-arousal two-dimensional space, emotion ratings for the stimulus videos were given every five seconds on a scale of 1 to 9, and the dynamic continuous emotion labels were normalized. Frequency band features were then extracted from the preprocessed EEG and eye movement data, and a cascade feature fusion approach was used to combine the EEG and eye movement features into an information-rich multimodal feature vector. This feature vector was fed into four regression models, namely support vector regression with a radial basis function kernel, decision tree, random forest, and K-nearest neighbors, to build the dynamic continuous emotion recognition model. The results showed that the proposed method achieved the lowest mean square error for valence and arousal across all six dynamic continuous emotion transitions. The approach can accurately recognize various emotion transitions in dynamic situations and offers higher accuracy and robustness than EEG or eye movement signals alone, making it well suited for practical applications.
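The fusion-and-regression stage described above (cascade fusion of EEG band features with eye movement features, followed by four regressors evaluated with mean square error) could be sketched as follows. This is a minimal illustration using scikit-learn; the feature dimensions, the synthetic data, the window count, and the train/test split are assumptions for demonstration only, not the authors' implementation.

# Minimal sketch of cascade feature fusion plus the four regression models
# named in the abstract; all data here is synthetic and dimensions are assumed.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n_windows = 600                                   # 5-second rating windows (assumed count)
eeg_feats = rng.normal(size=(n_windows, 160))     # EEG frequency-band features (assumed dim)
eye_feats = rng.normal(size=(n_windows, 33))      # eye movement features (assumed dim)
valence = rng.uniform(1, 9, size=n_windows)       # continuous labels on the 1-9 scale

# Cascade (concatenation) feature fusion: EEG and eye movement vectors joined per window
fused = np.hstack([eeg_feats, eye_feats])

X_train, X_test, y_train, y_test = train_test_split(
    fused, valence, test_size=0.2, random_state=0
)

models = {
    "SVR (RBF kernel)": SVR(kernel="rbf"),
    "Decision tree": DecisionTreeRegressor(random_state=0),
    "Random forest": RandomForestRegressor(random_state=0),
    "K-nearest neighbors": KNeighborsRegressor(),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"{name}: MSE = {mse:.3f}")

In practice the same pipeline would be run separately for the valence and arousal dimensions, with MSE compared per emotion transition scenario.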


Keywords: Dynamic continuous emotion; Electroencephalography signal; Emotion recognition model; Eye movement; Multimodal feature fusion.


Conflict of interest statement

All authors declare that they have no conflicts of interest.

Figures

Figure 1. Experimental paradigm
Figure 2. Framework diagram of the dynamic continuous emotion recognition model based on EEG and EM signals
Figure 3. Dynamic continuous emotion curve
Figure 4. Dynamic continuous emotion recognition results of EEG signals in different frequency bands
Figure 5. MSE results of dynamic continuous emotion recognition with EM signal features
