Transforming Motor Imagery Analysis: A Novel EEG Classification Framework Using AtSiftNet Method
- PMID: 39409506
- PMCID: PMC11479282
- DOI: 10.3390/s24196466
Abstract
This paper presents an innovative feature extraction approach based on self-attention, combined with various feature selection techniques, known as the AtSiftNet method, to enhance the classification performance of motor imagery activities using electroencephalography (EEG) signals. Initially, the EEG signals were sorted and then denoised using multiscale principal component analysis to obtain clean EEG signals; an experiment without denoising was also conducted for comparison. Subsequently, the clean EEG signals underwent the self-attention feature extraction method to compute the features of each trial (i.e., 350×18). The best 1 or 15 features were then extracted through eight different feature selection techniques. Finally, five different machine learning and neural network classification models were employed to calculate the accuracy, sensitivity, and specificity of this approach. The BCI competition III dataset IVa, comprising recordings from five volunteers who participated in the competition, was used for all experiments. The experimental findings reveal that the average classification accuracy is highest for ReliefF (99.946%), mutual information (98.902%), independent component analysis (99.62%), and principal component analysis (98.884%), for both the 1 and 15 best-selected features from each trial. These accuracies were obtained for motor imagery classification using a support vector machine (SVM) as the classifier. In addition, five-fold cross-validation was performed to assess fair performance estimation and the robustness of the model, yielding an average accuracy of 99.89%. These findings indicate that the proposed framework provides a resilient biomarker with minimal computational complexity, making it a suitable choice for advancing motor imagery brain–computer interfaces (BCIs).
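As a rough illustration of the pipeline summarized above, the sketch below wires together the stages named in the abstract using NumPy and scikit-learn. It is not the authors' implementation: the data are random placeholders, plain PCA reconstruction stands in for multiscale PCA denoising, the self-attention step is a generic scaled dot-product formulation, and only one of the eight feature selection techniques (mutual information) is shown. All shapes and parameter values are assumptions apart from the 350×18 per-trial feature size and k = 15 quoted in the abstract.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_time, n_channels = 280, 350, 18          # per-trial shape 350x18, as in the abstract
X_raw = rng.standard_normal((n_trials, n_time, n_channels))
y = rng.integers(0, 2, size=n_trials)                # two MI classes (placeholder labels)

def pca_denoise(trial, n_components=10):
    # Plain PCA reconstruction as a simple stand-in for the paper's
    # multiscale PCA denoising (assumption).
    p = PCA(n_components=n_components)
    return p.inverse_transform(p.fit_transform(trial))

def self_attention_features(trial):
    # Generic scaled dot-product self-attention over time with Q = K = V = trial,
    # yielding a 350x18 attended map that is flattened into one feature vector.
    d = trial.shape[1]
    scores = trial @ trial.T / np.sqrt(d)            # (time, time) attention scores
    scores -= scores.max(axis=1, keepdims=True)      # numerical stability
    w = np.exp(scores)
    w /= w.sum(axis=1, keepdims=True)                # softmax over time
    return (w @ trial).ravel()                       # (350*18,) features per trial

X = np.stack([self_attention_features(pca_denoise(t)) for t in X_raw])

# Keep the k best features (k = 15, per the abstract), then classify with an
# SVM; 5-fold cross-validation mirrors the paper's evaluation protocol.
clf = make_pipeline(
    StandardScaler(),
    SelectKBest(mutual_info_classif, k=15),
    SVC(kernel="rbf"),
)
print("5-fold accuracy: %.3f" % cross_val_score(clf, X, y, cv=5).mean())
```

Swapping the SelectKBest scorer (or plugging in an external ReliefF implementation) and the final estimator would reproduce the other selection/classifier configurations compared in the paper.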
Keywords: attention sift network (AtSiftNet); brain–computer interface (BCI); independent component analysis (ICA); motor imagery (MI); principal component analysis (PCA).
Conflict of interest statement
The authors declare no conflict of interest. The funding sponsors had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.
Grants and funding
- 52172387/National Natural Science Foundation of China
- U2033202, U1333119/Joint Fund of National Natural Science Foundation of China and Civil Aviation Administration of China
- ILA22032-1A/Fundamental Research Funds for the Central Universities
- 2022Z071052001/Aeronautical Science Foundation of China
- 2022JGZ14/Northwestern Polytechnical University