Sensors (Basel). 2024 Oct 7;24(19):6466. doi: 10.3390/s24196466.

Transforming Motor Imagery Analysis: A Novel EEG Classification Framework Using AtSiftNet Method

Haiqin Xu et al. Sensors (Basel). 2024.

Abstract

This paper presents an innovative feature extraction approach based on self-attention, combined with several feature selection techniques and referred to as the AtSiftNet method, to enhance the classification of motor imagery activities from electroencephalography (EEG) signals. Initially, the EEG signals were sorted and then denoised using multiscale principal component analysis (MSPCA) to obtain clean EEG signals; an experiment on non-denoised signals was also conducted for comparison. Subsequently, the clean EEG signals underwent the self-attention feature extraction method to compute the features of each trial (i.e., 350×18). The top 1 or 15 features were then selected using eight different feature selection techniques. Finally, five different machine learning and neural network classification models were employed to calculate the accuracy, sensitivity, and specificity of this approach. The BCI Competition III dataset IV-a, comprising recordings from the five volunteers who participated in the competition, was used for all experiments. The findings reveal that the average classification accuracy is highest for ReliefF (99.946%), Mutual Information (98.902%), Independent Component Analysis (99.62%), and Principal Component Analysis (98.884%), for both the top 1 and top 15 features selected from each trial. These accuracies were obtained for motor imagery classification using a Support Vector Machine (SVM) classifier. In addition, five-fold cross-validation was performed to provide a fair performance estimate and assess the robustness of the model; the average accuracy obtained through five-fold validation is 99.89%. The findings indicate that the suggested framework provides a resilient biomarker with minimal computational complexity, making it a suitable choice for advancing motor imagery brain-computer interfaces (BCIs).

Keywords: attention sift network (AtSiftNet); brain–computer interface (BCI); independent component analysis (ICA); motor imagery (MI); principal component analysis (PCA).
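
The processing pipeline summarized in the abstract (self-attention feature extraction per trial, selection of the top features, and SVM classification with five-fold cross-validation) can be illustrated with the Python sketch below. This is a minimal sketch, not the authors' implementation: the data are random placeholders shaped like the 350×18 per-trial maps described above, the MSPCA denoising step is omitted, the toy self-attention and the mutual-information selection stand in for the paper's AtSiftNet feature sifting, and all names and shapes are assumptions for illustration only.

import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data standing in for BCI Competition III dataset IV-a:
# 200 trials x 350 time samples x 18 channels, with binary MI labels.
X_trials = rng.standard_normal((200, 350, 18))
y = rng.integers(0, 2, size=200)

def self_attention_features(trial):
    """Toy scaled dot-product self-attention over time steps; the attended
    350x18 map is flattened into one feature vector per trial."""
    q = k = v = trial                              # (time, channels)
    scores = q @ k.T / np.sqrt(trial.shape[1])     # (time, time)
    scores -= scores.max(axis=1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
    return (weights @ v).reshape(-1)               # (time * channels,)

X = np.stack([self_attention_features(t) for t in X_trials])

# Mutual information (one of the eight selection techniques named in the
# abstract) keeps the 15 highest-scoring features.
mi = mutual_info_classif(X, y, random_state=0)
X_top15 = X[:, np.argsort(mi)[-15:]]

# SVM classifier evaluated with five-fold cross-validation.
svm = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(svm, X_top15, y, cv=5)
print(f"Mean 5-fold accuracy: {scores.mean():.3f}")

On real EEG trials, the same selection step could be swapped for ReliefF, ICA, or PCA to reproduce the other scenarios compared in the paper.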


Conflict of interest statement

The authors declare no conflict of interest. The funding sponsors had no role in the design of the study; in the collection, analysis, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Figures

Figure 1. Proposed framework for the novel AtSiftNet method.
Figure 2. Block diagram of multiscale principal component analysis (MSPCA).
Figure 3. Comparison between the original EEG signal and the denoised signal.
Figure 4. Architecture of the self-attention feature extraction.
Figure 5. Classification accuracy (%) for all subjects and machine learning classifiers.
Figure 6. 3D bar chart of sensitivity using different classifiers.
Figure 7. Specificity of all subjects on the MI task with all classifiers.
Figure 8. Performance parameters for motor imagery dataset IV-a.
Figure 9. Specificity of MI dataset IV-a using the SVM classifier.
Figure 10. Specificity of the MI task on dataset IV-a using the SVM classifier.
Figure 11. Graphical comparison between the top 1 and the top 15 selected features.
Figure 12. Correlation matrix for the top 15 selected features.
Figure 13. Correlation matrix for the top 1 selected feature.
Figure 14. Comparison of accuracy (%) for both scenarios using MIFs for feature selection and SVM as the classifier.
Figure 15. Graphical visualization of denoised and non-denoised data.
Figure 16. Feature extraction time for each subject.
Figure 17. Training time for each subject using all classifiers.
Figure 18. Testing time for each subject using all classifiers.
Figure 19. ROC curves and AUC values for each feature selection technique.
Figure 20. ROC graphs using MIFs as the feature selection technique.
Figure 21. Comparison of AUC values for the top 1 selected feature from each trial.
Figure 22. Comparison of AUC values for the top 15 selected features from each trial.

