Sci Rep. 2023 Aug 23;13(1):13804.
doi: 10.1038/s41598-023-40786-2.

Improved EEG-based emotion recognition through information enhancement in connectivity feature map


M A H Akhand et al. Sci Rep. 2023.

Abstract

Electroencephalography (EEG), despite its inherent complexity, is a preferred brain signal for automatic human emotion recognition (ER), a challenging machine learning task with emerging applications. In automatic ER, machine learning (ML) models classify emotions using features extracted from EEG signals; such feature extraction is therefore a crucial part of the ER process. Recently, EEG channel connectivity features have been widely used in ER, where the Pearson correlation coefficient (PCC), mutual information (MI), phase-locking value (PLV), and transfer entropy (TE) are well-known methods for connectivity feature map (CFM) construction. CFMs are typically formed in a two-dimensional configuration from the signals of pairs of EEG channels, and such two-dimensional CFMs are usually symmetric and therefore hold redundant information. This study proposes the construction of a more informative CFM that can lead to better ER. Specifically, the proposed technique combines the CFM measures of two different individual methods into a single, more informative fused CFM. This fusion incurs no additional computational cost in training the ML model. In this study, fused CFMs are constructed by combining every pair of methods from PCC, PLV, MI, and TE, and the resulting fused CFMs (PCC + PLV, PCC + MI, PCC + TE, PLV + MI, PLV + TE, and MI + TE) are used to classify emotions with a convolutional neural network. Rigorous experiments on the DEAP benchmark EEG dataset show that the proposed fused CFMs deliver better ER performance than CFMs built with a single connectivity method (e.g., PCC). Overall, PLV + MI-based ER is the most promising, as it outperforms the other methods.
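To make the fusion idea concrete, here is a minimal NumPy sketch under one plausible scheme: because each single-method CFM is symmetric, the values of one method can fill the lower triangle and those of a second method the upper triangle, yielding one map that carries both measures with no extra cells. The abstract does not specify the authors' exact fusion rule, so this triangle-packing scheme, the function names, and the PLV computation via an FFT-based analytic signal are illustrative assumptions, not the paper's method.

```python
import numpy as np

def pcc_cfm(eeg):
    """Pearson-correlation CFM (channels x channels) for eeg of shape (channels, samples)."""
    return np.corrcoef(eeg)

def plv_cfm(eeg):
    """Phase-locking-value CFM from instantaneous phases (FFT-based analytic signal)."""
    n = eeg.shape[-1]
    spec = np.fft.fft(eeg, axis=-1)
    h = np.zeros(n)                      # analytic-signal filter (Hilbert transform)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    phase = np.angle(np.fft.ifft(spec * h, axis=-1))
    diff = phase[:, None, :] - phase[None, :, :]       # pairwise phase differences
    return np.abs(np.exp(1j * diff).mean(axis=-1))     # PLV in [0, 1]

def fused_cfm(eeg, cfm_a=pcc_cfm, cfm_b=plv_cfm):
    """Illustrative fusion: method A fills the strict lower triangle,
    method B the upper triangle (incl. diagonal) of a single 2-D map."""
    a, b = cfm_a(eeg), cfm_b(eeg)
    return np.tril(a, k=-1) + np.triu(b)

# Toy example: 32 channels, 512 samples of synthetic "EEG"
rng = np.random.default_rng(0)
eeg = rng.standard_normal((32, 512))
cfm = fused_cfm(eeg)
print(cfm.shape)  # (32, 32) — same input size as a single-method CFM
```

Because the fused map has the same dimensions as a single-method CFM, the downstream CNN needs no architectural change, which is consistent with the abstract's claim that fusion adds no training cost.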


Conflict of interest statement

The authors declare no competing interests.

Figures

Figure 1
The framework of the proposed EEG-based emotion recognition system.
Figure 2
Heatmap representation of sample connectivity feature maps (CFMs) constructed with individual methods. Heatmaps were generated with the Python Seaborn library (https://seaborn.pydata.org/).
Figure 3
Heatmap representation of sample fused CFMs constructed with every pair of different methods. Heatmaps were generated with the Python Seaborn library (https://seaborn.pydata.org/).
Figure 4
CNN architecture (with the dimensions of individual layers) for classifying emotions from CFMs.
Figure 5
Model loss and accuracy for Valence classification using CFMs with individual connectivity methods.
Figure 6
Valence and Arousal classification accuracies with individual connectivity methods in different frequency sub-bands and the full frequency band.
Figure 7
Model loss and accuracy for Valence classification using fused CFMs with two connectivity methods.
Figure 8
Test-set accuracies for CFMs with individual methods and fused CFMs in Valence and Arousal classification for the Gamma sub-band.
Figure 9
Subject-dependent and subject-independent test-set classification accuracies with the fused PLV + MI CFM.
Figure 10
CNN training time for models with different CFMs.

