Brain Sci. 2020 Aug 6;10(8):524. doi: 10.3390/brainsci10080524.

A 36-Class Bimodal ERP Brain-Computer Interface Using Location-Congruent Auditory-Tactile Stimuli


Boyang Zhang et al. Brain Sci. 2020.

Abstract

To date, traditional visual-based event-related potential brain-computer interface (ERP-BCI) systems continue to dominate mainstream BCI research. However, these conventional BCIs are unsuitable for individuals who have partially or completely lost their vision. Given the poor performance of gaze-independent ERP-BCIs, it is necessary to study techniques that improve the performance of these BCI systems. In this paper, we developed a novel 36-class bimodal ERP-BCI system based on tactile and auditory stimuli, in which six virtual-direction audio files produced via head-related transfer functions (HRTF) were delivered through headphones while location-congruent electro-tactile stimuli were simultaneously delivered to the corresponding positions using electrodes placed on the abdomen and waist. We selected the eight best channels, trained a Bayesian linear discriminant analysis (BLDA) classifier, and acquired the optimal trial number for target selection in the online process. The average online information transfer rate (ITR) of the bimodal ERP-BCI reached 11.66 bit/min, improvements of 35.11% and 36.69% over the auditory (8.63 bit/min) and tactile (8.53 bit/min) approaches, respectively. The results demonstrate that the performance of the bimodal system is superior to that of each unimodal system. These findings indicate that the proposed bimodal system has potential utility as a gaze-independent BCI in future real-world applications.
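For context, ITR in ERP-BCI studies is commonly computed with the Wolpaw formula; the sketch below shows that formula together with the relative-improvement arithmetic behind the 35.11% and 36.69% figures quoted above. The accuracy and selection-rate values in the example call are illustrative placeholders, not the paper's operating point, and the paper may use a different ITR definition.

```python
import math

def wolpaw_itr(n_classes: int, accuracy: float, selections_per_min: float) -> float:
    """Wolpaw ITR in bit/min: bits per selection times selection rate."""
    n, p = n_classes, accuracy
    if p >= 1.0:
        bits = math.log2(n)
    else:
        bits = (math.log2(n) + p * math.log2(p)
                + (1 - p) * math.log2((1 - p) / (n - 1)))
    return bits * selections_per_min

# Illustrative placeholder values only (36 classes, 80% accuracy, 3 selections/min).
print(f"example ITR: {wolpaw_itr(36, 0.80, 3.0):.2f} bit/min")

# Relative improvements quoted in the abstract (bimodal vs. unimodal ITRs).
bimodal, auditory, tactile = 11.66, 8.63, 8.53
print(f"vs. auditory: {100 * (bimodal / auditory - 1):.2f}%")  # -> 35.11%
print(f"vs. tactile:  {100 * (bimodal / tactile - 1):.2f}%")   # -> 36.69%
```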

Keywords: ANOVA; BCI; BLDA; ERP; HRTF; auditory; bimodal stimulus; electro-tactile; gaze-independent; location-congruent.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1
Illustration of the character distribution of the 36-class brain-computer interface (BCI) system and the bimodal stimulus paradigm. In subfigure (a), the prompted character “J” is marked in red as an example. In subfigure (b), the stimuli are delivered from the corresponding locations. Each yellow broken line represents a pair of electro-tactile electrodes placed at waist level. The arrows represent the directions of the auditory stimuli delivered through headphones. The numbers identify the stimulus codes (location-congruent bimodal stimuli “1” to “6”).
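To make the 36-class layout concrete, the sketch below shows one plausible way two successive six-direction selections (stimulus codes 1 to 6) could index a 6 × 6 character grid. The row-major A–Z, 0–9 ordering and the two-step selection scheme are assumptions made for illustration only; the paper's actual character arrangement is the one defined in subfigure (a).

```python
# Hypothetical sketch: two successive 6-direction selections pick a character
# from an assumed 6 x 6 grid. Ordering and selection scheme are NOT taken
# from the paper; they are placeholders for illustration.
CHARS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"  # 36 symbols, order assumed

def decode(row_code: int, col_code: int) -> str:
    """Map a (row, column) pair of stimulus codes in 1..6 to a character."""
    assert 1 <= row_code <= 6 and 1 <= col_code <= 6
    return CHARS[(row_code - 1) * 6 + (col_code - 1)]

print(decode(2, 4))  # -> "J" under this assumed ordering
```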
Figure 2
Illustration of electro-tactile impulse stimuli.
Figure 3
Stimulus sequence generation.
Figure 4
Configuration of electrode locations.
Figure 5
Flow diagram of the jumpwise regression algorithm. The value pU is the p-value of the partial F-test statistic comparing the model containing all features of the selected channel set (SC) plus a candidate channel from the unselected channel set (UC) against the model containing only the features of SC, and the value pS is the p-value of the partial F-test statistic comparing the model containing all features of SC against the model containing the features of SC except those of the channel considered for removal. The p-value thresholds for adding and removing channels were set to 0.1 and 0.15, respectively.
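Based on the caption above, here is a minimal sketch of stepwise channel selection driven by partial F-test p-values with the 0.1/0.15 thresholds. It assumes ordinary least-squares regression of the labels on concatenated per-channel ERP features; it is an approximation of a jumpwise-style procedure, not the authors' implementation.

```python
# Minimal sketch of stepwise (jumpwise-style) channel selection with partial
# F-tests. X_by_channel maps channel name -> (n_trials, n_features) array,
# y is the label vector. Assumed setup, not the paper's exact method.
import numpy as np
from scipy import stats

P_ADD, P_REMOVE = 0.10, 0.15  # thresholds from the Figure 5 caption

def rss(X, y):
    """Residual sum of squares and parameter count of an OLS fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r), X.shape[1]

def partial_f_pvalue(X_reduced, X_full, y):
    """p-value of the partial F-test comparing nested linear models."""
    rss_r, k_r = rss(X_reduced, y)
    rss_f, k_f = rss(X_full, y)
    df1, df2 = k_f - k_r, len(y) - k_f
    F = ((rss_r - rss_f) / df1) / (rss_f / df2)
    return stats.f.sf(F, df1, df2)

def jumpwise_select(X_by_channel, y, max_channels=8):
    selected, unselected = [], list(X_by_channel)
    ones = np.ones((len(y), 1))  # intercept column
    design = lambda chans: (np.hstack([ones] + [X_by_channel[c] for c in chans])
                            if chans else ones)
    while len(selected) < max_channels and unselected:
        # Forward step: add the candidate with the smallest p-value below P_ADD.
        p_vals = {c: partial_f_pvalue(design(selected), design(selected + [c]), y)
                  for c in unselected}
        best = min(p_vals, key=p_vals.get)
        if p_vals[best] >= P_ADD:
            break
        selected.append(best)
        unselected.remove(best)
        # Backward step: drop any earlier channel whose p-value exceeds P_REMOVE.
        for c in list(selected[:-1]):
            rest = [s for s in selected if s != c]
            if partial_f_pvalue(design(rest), design(selected), y) > P_REMOVE:
                selected.remove(c)
                unselected.append(c)
    return selected
```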
Figure 6
Offline ITR (a) and classification accuracy (b) of the auditory, tactile, and bimodal BCIs using the first ten runs of each session. Error bars denote the standard deviation.
Figure 7
Classification accuracy comparisons among the six directions for the auditory (a), tactile (b), and bimodal (c) approaches. The green line divides the stimulus area into the right-posterior part (“4”, “5” and “6”) and the left-anterior part (“1”, “2” and “3”) for ease of significance analysis.
Figure 8
The results of channel selection for each subject. Subfigures (a–c) show the eight optimal channels of each subject for the three modalities; subfigures (d–f) show the total number of times each channel was selected across the twelve subjects. The color bar from dark blue to dark red indicates the summed number of selections for each electrode, from zero to eight times.
