A Novel Quick-Response Eigenface Analysis Scheme for Brain-Computer Interfaces

Hojong Choi et al.

Sensors (Basel). 2022 Aug 5;22(15):5860. doi: 10.3390/s22155860.
Abstract

A brain-computer interface (BCI) interprets brain activity to interact with external devices with the help of motor imagery (MI). To date, classification results for the four-class EEG BCI competition datasets have steadily improved, raising the accuracy of BCI systems. Based on this observation, a novel quick-response eigenface analysis (QR-EFA) scheme for motor imagery is proposed to further improve BCI classification accuracy. QR-EFA represents BCI signals in a standardized and sharable quick-response (QR) image domain and then systematically combines EFA with a convolutional neural network (CNN) to classify the resulting neuro images. To overcome the non-stationary and non-ergodic characteristics of the available BCI datasets, we applied neuro data augmentation during the training phase. To further improve classification performance, QR-EFA maximizes the similarities along the domain-, trial-, and subject-wise directions. To validate the proposed scheme, we evaluated it on BCI competition IV dataset 2a (C4D2a_4C) and BCI competition III dataset 3a (C3D3a_4C). The experimental results confirm that QR-EFA outperforms previously published results, improving accuracy from 85.4% to 97.87% ± 0.75% for C4D2a_4C and achieving 88.21% ± 6.02% for C3D3a_4C. Therefore, QR-EFA could serve as a reliable MI classification framework for BCI applications.
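The abstract does not include implementation details, so the following is a minimal sketch of the general eigenface (PCA-based) feature-extraction idea applied to EEG trials rendered as small images, followed by a compact CNN classifier. The image size, the trial-to-image mapping, and the network layout are illustrative assumptions, not the authors' exact QR-EFA pipeline.

```python
# Minimal sketch (not the authors' code): eigenface-style feature extraction
# over EEG trials rendered as small images, plus a compact CNN classifier.
import numpy as np
import torch
import torch.nn as nn

H, W = 12, 10            # assumed neuro-image size (cf. the C3D3a_4C figure captions)
N_CLASSES = 4            # left, right, feet, tongue

def to_image(trial, h=H, w=W):
    """Map one EEG trial (channels x samples) to a small 2-D image.
    Here we simply average the flattened samples into h*w bins and reshape;
    the paper's QR-image construction is not specified in the abstract."""
    flat = trial.reshape(-1)
    bins = np.array_split(flat, h * w)
    img = np.array([b.mean() for b in bins], dtype=np.float32).reshape(h, w)
    return (img - img.mean()) / (img.std() + 1e-8)   # whitening-like normalization

def eigenfaces(train_imgs, k=16):
    """Classic eigenface step: PCA (via SVD) on vectorized training images."""
    X = train_imgs.reshape(len(train_imgs), -1)      # (n, H*W)
    mean = X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:k]                              # top-k eigen-images

def project(imgs, mean, components):
    """Project images onto the eigenface basis to obtain feature vectors."""
    X = imgs.reshape(len(imgs), -1) - mean
    return X @ components.T

class SmallCNN(nn.Module):
    """Toy CNN over the H x W neuro images (the layout is an assumption)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(16 * H * W, N_CLASSES),
        )

    def forward(self, x):          # x: (batch, 1, H, W)
        return self.net(x)

# Usage with random stand-in data (real EEG trials would replace this):
trials = np.random.randn(30, 22, 250)                 # 30 trials, 22 channels, 250 samples
imgs = np.stack([to_image(t) for t in trials])
mean, comps = eigenfaces(imgs, k=8)
feats = project(imgs, mean, comps)                    # eigenface features, shape (30, 8)
logits = SmallCNN()(torch.from_numpy(imgs[:, None]))  # CNN class scores, shape (30, 4)
```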

Keywords: eigenface analysis; image data augmentation; motor imagery classification; quick response neuro images; standardized and sharable quick response eigenfaces.
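The abstract and figure captions mention image data augmentation on the QR training images but do not state which transforms were used. Below is a minimal sketch of simple image-level augmentation (small additive noise and integer pixel shifts) that could be applied to such training images; the transform choices and parameters are assumptions for illustration only.

```python
# Minimal augmentation sketch (assumed transforms, not the paper's method):
# perturb small neuro/QR training images with noise and integer shifts.
import numpy as np

def augment(img, rng, noise_std=0.05, max_shift=1):
    """Return one augmented copy of a (H, W) image."""
    out = img + rng.normal(0.0, noise_std, size=img.shape)     # additive Gaussian noise
    dy, dx = rng.integers(-max_shift, max_shift + 1, size=2)   # small circular shift
    return np.roll(out, shift=(dy, dx), axis=(0, 1))

def augment_set(imgs, labels, copies, seed=0):
    """Expand a training set by `copies` augmented versions per image."""
    rng = np.random.default_rng(seed)
    aug = [augment(img, rng) for img in imgs for _ in range(copies)]
    aug_labels = np.repeat(labels, copies)
    return np.concatenate([imgs, np.stack(aug)]), np.concatenate([labels, aug_labels])

# Example (assuming `imgs` and integer class `labels` for the training trials):
# imgs_aug, labels_aug = augment_set(imgs, labels, copies=2)
```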


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1. The BCI layer model compared to the OSI network model.
Figure 2. Overall collaboration between QR-EFA and EFA.
Figure 3. The EFA algorithm procedure.
Figure 4. Data analysis along the viewpoint directions.
Figure 5. Overall flowchart for QR-EFA (solid lines), including EFA (dashed lines). Cross-references in the flowchart correspond to steps 1–11.
Figure 6. The CNN for neuro images in QR-EFA.
Figure 7. Original eigenface images of subjects in the training data for the 4 classes (left, right, feet, and tongue) after whitening (12 × 10 = 30 × 4) in C3D3a_4C.
Figure 8. Selected sample QR training images from data augmentation for the (a) left and (b) tongue classes (12 × 10) in C3D3a_4C.
Figure 9. Primitive raw images of the EEG signal for the selected first and last trials (80 × 60) in C3D3a_4C.
Figure 10. Training QR code realization of subject 1 at trial 1 for the 4 classes (left, right, feet, and tongue) (12 × 10) in C3D3a_4C.
Figure 11. Evolution of the CNN cost function vs. number of epochs in C3D3a_4C.
Figure 12. Original eigenface images of subjects in the training data for the 4 classes (left, right, feet, and tongue) after whitening (18 × 16 = 72 × 4) in C4D2a_4C.
Figure 13. Selected sample QR training images from data augmentation for the (a) left and (b) tongue classes (18 × 16) in C4D2a_4C.
Figure 14. Primitive raw images of the EEG signal for the selected first and last trials (50 × 50) in C4D2a_4C.
Figure 15. Training QR code realization of subject 1 at trial 1 for the 4 classes (left, right, feet, and tongue) (18 × 16) in C4D2a_4C.
Figure 16. Evolution of the CNN cost function vs. number of epochs in C4D2a_4C.

