EmotionMeter: A Multimodal Framework for Recognizing Human Emotions
- PMID: 29994384
- DOI: 10.1109/TCYB.2018.2797176
Abstract
In this paper, we present a multimodal emotion recognition framework called EmotionMeter that combines brain waves and eye movements. To increase the feasibility and wearability of EmotionMeter in real-world applications, we design a six-electrode placement above the ears to collect electroencephalography (EEG) signals. We combine EEG and eye movements for integrating the internal cognitive states and external subconscious behaviors of users to improve the recognition accuracy of EmotionMeter. The experimental results demonstrate that modality fusion with multimodal deep neural networks can significantly enhance the performance compared with a single modality, and the best mean accuracy of 85.11% is achieved for four emotions (happy, sad, fear, and neutral). We explore the complementary characteristics of EEG and eye movements for their representational capacities and identify that EEG has the advantage of classifying happy emotion, whereas eye movements outperform EEG in recognizing fear emotion. To investigate the stability of EmotionMeter over time, each subject performs the experiments three times on different days. EmotionMeter obtains a mean recognition accuracy of 72.39% across sessions with the six-electrode EEG and eye movement features. These experimental results demonstrate the effectiveness of EmotionMeter within and between sessions.
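The fusion strategy the abstract describes can be illustrated with a minimal feature-level sketch: per-trial EEG and eye-movement feature vectors are concatenated and scored by a classifier. This is an illustrative assumption of the simplest fusion variant, not the paper's actual deep-network architecture; the feature dimensions, weights, and helper names (`fuse_features`, `predict`) are hypothetical.

```python
# Minimal sketch of feature-level (concatenation) fusion for four-emotion
# classification. Dimensions, weights, and feature semantics are
# illustrative assumptions, not the paper's configuration.

EMOTIONS = ["happy", "sad", "fear", "neutral"]

def fuse_features(eeg_features, eye_features):
    """Concatenate per-trial EEG and eye-movement feature vectors."""
    return list(eeg_features) + list(eye_features)

def predict(fused, weights):
    """Toy linear scorer: one weight vector per emotion class;
    the class with the highest dot-product score wins."""
    scores = {emo: sum(w * x for w, x in zip(ws, fused))
              for emo, ws in weights.items()}
    return max(scores, key=scores.get)

if __name__ == "__main__":
    eeg = [0.2, 0.8, 0.1]   # e.g. EEG differential-entropy features (illustrative)
    eye = [0.5, 0.3]        # e.g. pupil-diameter / fixation features (illustrative)
    fused = fuse_features(eeg, eye)
    weights = {emo: [0.0] * len(fused) for emo in EMOTIONS}
    weights["happy"] = [1.0] * len(fused)
    print(predict(fused, weights))  # → happy
```

A real implementation would replace the linear scorer with the multimodal deep neural network the paper evaluates, but the fusion step itself (joining internal EEG features with external eye-movement features before classification) has the same shape.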
Similar articles
- Multimodal Emotion Recognition from Eye Image, Eye Movement and EEG Using Deep Neural Networks. Annu Int Conf IEEE Eng Med Biol Soc. 2019 Jul;2019:3071-3074. doi: 10.1109/EMBC.2019.8856563. PMID: 31946536
- [Dynamic continuous emotion recognition method based on electroencephalography and eye movement signals]. Sheng Wu Yi Xue Gong Cheng Xue Za Zhi. 2025 Feb 25;42(1):32-41. doi: 10.7507/1001-5515.202408013. PMID: 40000173. Free PMC article. Chinese.
- Cross-Modal Guiding Neural Network for Multimodal Emotion Recognition From EEG and Eye Movement Signals. IEEE J Biomed Health Inform. 2024 Oct;28(10):5865-5876. doi: 10.1109/JBHI.2024.3419043. Epub 2024 Oct 3. PMID: 38917288
- Multimodal fusion framework: a multiresolution approach for emotion classification and recognition from physiological signals. Neuroimage. 2014 Nov 15;102 Pt 1:162-72. doi: 10.1016/j.neuroimage.2013.11.007. Epub 2013 Nov 20. PMID: 24269801. Review.
- Emotion recognition from physiological signals. J Med Eng Technol. 2011 Aug-Oct;35(6-7):300-7. doi: 10.3109/03091902.2011.601784. PMID: 21936746. Review.
Cited by
- MISNet: multi-source information-shared EEG emotion recognition network with two-stream structure. Front Neurosci. 2024 Feb 14;18:1293962. doi: 10.3389/fnins.2024.1293962. PMID: 38419660. Free PMC article.
- Wearable Emotion Recognition Using Heart Rate Data from a Smart Bracelet. Sensors (Basel). 2020 Jan 28;20(3):718. doi: 10.3390/s20030718. PMID: 32012920. Free PMC article.
- Recognizing emotions induced by wearable haptic vibration using noninvasive electroencephalogram. Front Neurosci. 2023 Jul 6;17:1219553. doi: 10.3389/fnins.2023.1219553. PMID: 37483356. Free PMC article.
- Multi-Input CNN-LSTM deep learning model for fear level classification based on EEG and peripheral physiological signals. Front Psychol. 2023 Jun 1;14:1141801. doi: 10.3389/fpsyg.2023.1141801. PMID: 37325747. Free PMC article.
- MP: A steady-state visual evoked potential dataset based on multiple paradigms. iScience. 2024 Sep 25;27(11):111030. doi: 10.1016/j.isci.2024.111030. PMID: 39759080. Free PMC article.