Semi-supervised bipartite graph construction with active EEG sample selection for emotion recognition
- PMID: 38700614
- DOI: 10.1007/s11517-024-03094-z
Abstract
Electroencephalogram (EEG) signals originate in the central nervous system and are inherently difficult to camouflage, which has made EEG-based emotion recognition increasingly popular. However, owing to the non-stationary nature of EEG, inter-subject variability hinders recognition models from adapting well to different subjects. In this paper, we propose a novel approach, semi-supervised bipartite graph construction with active EEG sample selection (SBGASS), for cross-subject emotion recognition, which offers two significant advantages. First, SBGASS adaptively learns a bipartite graph that characterizes the underlying relationships between labeled and unlabeled EEG samples, effectively establishing semantic connections between samples from different subjects. Second, we employ an active sample selection technique to reduce the impact of negative samples (outliers or noise in the data) on bipartite graph construction. From the experimental results on the SEED-IV data set, we gain the following three insights. (1) SBGASS actively rejects negative labeled samples, which mitigates their impact on constructing the optimal bipartite graph and improves model performance. (2) Through the learned optimal bipartite graph, the transferability of labeled EEG samples is quantitatively analyzed and exhibits a decreasing tendency as the distance between each labeled sample and its class centroid increases. (3) Beyond the improved recognition accuracy, the spatial-frequency patterns of emotion recognition are investigated via the acquired projection matrix.
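As a rough illustration of the two ingredients the abstract describes — rejecting labeled samples that lie far from their class centroid, and connecting labeled to unlabeled samples through a bipartite graph — the following is a minimal NumPy sketch. The Gaussian-kernel affinity, the k-nearest-neighbor sparsification, and the `keep_ratio` threshold are illustrative assumptions, not the paper's actual adaptive graph-learning objective; the toy data stands in for real EEG features.

```python
import numpy as np

def select_active_samples(X_l, y_l, keep_ratio=0.8):
    """Reject labeled samples far from their class centroid (possible outliers).

    Hypothetical criterion: since transferability is reported to decrease with
    distance to the class centroid, keep only the closest `keep_ratio`
    fraction of samples in each class.
    """
    keep = np.zeros(len(y_l), dtype=bool)
    for c in np.unique(y_l):
        idx = np.where(y_l == c)[0]
        centroid = X_l[idx].mean(axis=0)
        dist = np.linalg.norm(X_l[idx] - centroid, axis=1)
        n_keep = max(1, int(keep_ratio * len(idx)))
        keep[idx[np.argsort(dist)[:n_keep]]] = True
    return keep

def bipartite_graph(X_l, X_u, sigma=1.0, k=5):
    """Gaussian-kernel bipartite affinity from unlabeled to labeled samples,
    sparsified to each unlabeled sample's k strongest labeled neighbors and
    row-normalized (a fixed stand-in for the adaptively learned graph)."""
    d2 = ((X_u[:, None, :] - X_l[None, :, :]) ** 2).sum(-1)   # (n_u, n_l)
    W = np.exp(-d2 / (2 * sigma ** 2))
    mask = np.zeros_like(W, dtype=bool)
    nn = np.argsort(-W, axis=1)[:, :k]                        # top-k per row
    mask[np.arange(len(W))[:, None], nn] = True
    W = W * mask
    return W / W.sum(axis=1, keepdims=True)

# Toy demo: two well-separated classes of labeled and unlabeled points.
rng = np.random.default_rng(0)
X_l = np.vstack([rng.normal(0, 0.5, (10, 4)), rng.normal(3, 0.5, (10, 4))])
y_l = np.array([0] * 10 + [1] * 10)
X_u = np.vstack([rng.normal(0, 0.5, (5, 4)), rng.normal(3, 0.5, (5, 4))])

keep = select_active_samples(X_l, y_l, keep_ratio=0.8)
W = bipartite_graph(X_l[keep], X_u)         # (n_u, n_kept)
Y_l = np.eye(2)[y_l[keep]]                  # one-hot labels of kept samples
y_pred = (W @ Y_l).argmax(axis=1)           # propagate labels across the graph
```

Here label propagation reduces to one matrix product because the sketch fixes the graph in advance; in the paper the graph itself is optimized jointly with the projection matrix.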
Keywords: Active sample selection; Bipartite graph; Electroencephalogram (EEG); Emotion recognition; Semi-supervised learning.
© 2024. International Federation for Medical and Biological Engineering.
