MSHANet: a multi-scale residual network with hybrid attention for motor imagery EEG decoding
- PMID: 39712122
- PMCID: PMC11655790
- DOI: 10.1007/s11571-024-10127-8
Abstract
EEG decoding plays a crucial role in the development of motor imagery brain-computer interfaces (MI-BCIs). Deep learning has great potential to automatically extract EEG features for end-to-end decoding. Currently, deep learning faces the challenge of decoding large amounts of time-variant EEG while retaining stable performance across sessions. This study proposes a multi-scale residual network with hybrid attention (MSHANet) to decode four motor imagery classes. The MSHANet combines multi-head attention and squeeze-and-excitation attention to jointly focus on important information in the EEG features, and applies a multi-scale residual block to extract rich EEG features, sharing part of the block parameters to extract common features. Compared with seven state-of-the-art methods, the MSHANet exhibits the best accuracy on BCI Competition IV 2a, with 83.18% for the session-specific task and 80.09% for the cross-session task. Thus, the proposed MSHANet decodes time-varying EEG robustly and can reduce the time cost of MI-BCI use, which is beneficial for long-term use.
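To make the hybrid-attention idea concrete, below is a minimal NumPy sketch of the squeeze-and-excitation half of the mechanism the abstract describes: channel features are "squeezed" by global average pooling over time, passed through a small bottleneck, and turned into per-channel gates that reweight the feature map. All shapes, the reduction ratio, and the weight initialization here are illustrative assumptions, not the authors' implementation (which also includes multi-head attention and multi-scale residual blocks).

```python
import numpy as np

def squeeze_excite(x, w1, w2):
    """Channel-wise squeeze-and-excitation gate (illustrative sketch).

    x  : (channels, time) EEG feature map
    w1 : (reduced, channels) bottleneck weights   -- assumed shapes
    w2 : (channels, reduced) expansion weights
    """
    z = x.mean(axis=1)                       # squeeze: global average pool over time
    s = np.maximum(w1 @ z, 0.0)              # excitation: bottleneck + ReLU
    s = 1.0 / (1.0 + np.exp(-(w2 @ s)))      # sigmoid gate in (0, 1) per channel
    return x * s[:, None]                    # reweight channels by learned importance

rng = np.random.default_rng(0)
x = rng.standard_normal((22, 1000))          # e.g. 22 EEG channels x 1000 time samples
w1 = rng.standard_normal((4, 22)) * 0.1      # reduction ratio of ~22/4 (assumed)
w2 = rng.standard_normal((22, 4)) * 0.1
y = squeeze_excite(x, w1, w2)                # same shape as x, channels rescaled
```

Because each gate lies in (0, 1), the output is a per-channel attenuation of the input, letting the network emphasize informative EEG channels while suppressing the rest.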
Keywords: Brain-computer interface; EEG decoding; Hybrid attention; Motor imagery; Multi-scale residual network.
© The Author(s), under exclusive licence to Springer Nature B.V. 2024. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
Conflict of interest statement
The authors declare that they have no conflict of interest.