. 2018 Sep 4;8(1):13176.
doi: 10.1038/s41598-018-31425-2.

An error-aware gaze-based keyboard by means of a hybrid BCI system


Fotis P Kalaganis et al. Sci Rep.

Abstract

Gaze-based keyboards offer a flexible way for human-computer interaction for both disabled and able-bodied people. Despite their convenience, they still lead to error-prone human-computer interaction: eye-tracking devices may misinterpret the user's gaze, resulting in typesetting errors, especially when operated in fast mode. As a potential remedy, we present a novel error-detection system that aggregates the decisions of two distinct subsystems, each dealing with a disparate data stream. The first subsystem operates on gaze-related measurements and exploits the eye-transition pattern to flag a typo. The second is a brain-computer interface that utilizes a neural response, known as the Error-Related Potential (ErrP), which is inherently generated whenever the subject observes an erroneous action. Based on experimental data gathered from 10 participants in a spontaneous typesetting scenario, we first demonstrate that ErrP-based brain-computer interfaces can indeed be useful in the context of gaze-based typesetting, despite the putative contamination of EEG activity by eye-movement artefacts. We then show that the performance of this subsystem can be further improved by also taking into account the error detections of the gaze-related subsystem. Finally, the proposed bimodal error-detection system is shown to significantly reduce the typesetting time on a gaze-based keyboard.
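As a toy illustration of the decision-level fusion described above (the function name, weights, and fusion rule are our assumptions for illustration, not the authors' actual aggregation scheme), two signed classifier outputs can be combined as:

```python
def fuse_decisions(gaze_score, errp_score, w_gaze=0.5, w_errp=0.5, threshold=0.0):
    """Fuse the signed decision values of two classifiers (e.g. linear SVMs)
    into a single verdict.  A positive fused score flags the last keystroke
    as a typo.  Weights and threshold are illustrative defaults."""
    fused = w_gaze * gaze_score + w_errp * errp_score
    return fused > threshold

# Gaze subsystem weakly flags an error, ErrP subsystem strongly agrees:
print(fuse_decisions(0.2, 0.9))  # True -> delete the last character
```

In such a scheme, a weak vote from one modality can be confirmed or vetoed by the other, which is the intuition behind combining the two subsystems.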


Conflict of interest statement

The authors declare no competing interests.

Figures

Figure 1
(Top) Schematic outline of the error-aware keyboard. A hybrid BCI system, relying on brain-activity patterns and eye-movement features, detects and deletes mistyped characters. The depicted Machine Learning (ML) modules correspond to linear SVMs. (Bottom) Timeline of the sequence of events during the typesetting experiment. Initially, the participant starts gazing at the desired letter. Once a 500 ms interval of continuous gazing is completed, the key is registered and, simultaneously, the associated visual indication is presented. The physiological responses following this indication are used to detect typesetting errors. Note that the “eye” icon was not presented during the experiments and is shown here only for clarity.
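The 500 ms dwell-based key registration described in the caption can be sketched as follows (a minimal illustration assuming a fixed-rate eye tracker and a key-label-per-sample representation; `register_key` and its parameters are ours, not the authors'):

```python
def register_key(gaze_samples, target_key, dwell_ms=500, sample_rate_hz=100):
    """Register a key once the gaze has rested on it for `dwell_ms` of
    consecutive samples.  `gaze_samples` is a sequence of key labels
    (one per eye-tracker sample).  Returns the sample index at which the
    key is registered, or None if the dwell criterion is never met."""
    needed = int(dwell_ms / 1000 * sample_rate_hz)  # samples of continuous gaze
    run = 0
    for i, key in enumerate(gaze_samples):
        run = run + 1 if key == target_key else 0
        if run >= needed:
            return i
    return None
```

A glance away from the key resets the dwell counter, which is why misread gaze samples in fast operation can produce the typesetting errors the system is designed to catch.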
Figure 2
Single-subject averaged brain-activation traces for correct (blue) and wrong (red) button selections are shown in the top middle panel. Particular latencies are indicated on these traces (E: error; C: correct), and the corresponding topographies are included in the top left/right panels. The traces in the bottom middle panel reflect eye-movement activity (derived by averaging correspondingly across the epochs of the gaze-related signal). Zero time indicates the instant at which typing of the current letter has been completed and the eyes are free to move towards the next letter.
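The averaged traces in Figure 2 result from standard epoch averaging time-locked to the registration event; a minimal sketch (the helper name, sampling rate, and epoch window are assumptions, not taken from the paper):

```python
import numpy as np

def average_epochs(eeg, events, sfreq=250, tmin=-0.2, tmax=0.8):
    """Average EEG epochs time-locked to key-registration events.
    eeg: array of shape (n_channels, n_samples); events: sample indices of
    the visual indication.  Returns the per-channel average trace."""
    pre, post = int(-tmin * sfreq), int(tmax * sfreq)
    epochs = [eeg[:, e - pre:e + post] for e in events
              if e - pre >= 0 and e + post <= eeg.shape[1]]  # drop edge events
    return np.mean(epochs, axis=0)
```

Averaging correct-selection and wrong-selection epochs separately yields the blue and red traces, with the ErrP emerging as a systematic difference between them.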
Figure 3
(Left) Scatter plot of gaze-centre displacements (derived by integrating the derivatives of the eye-position coordinates within a time interval that includes the key registration at 0 latency). Each dot indicates the main direction of the eyes after the correct typesetting of a single letter. The point swarm has been partitioned into 4 groups, and the membership of each dot is indicated by colour. The associated brain-signal and eye-movement activity traces have been grouped accordingly, and their (sub)averages are shown in the top-right and bottom-right panels, respectively, using the colour code defined in the scatter plot.
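Under our reading of the caption, integrating the derivatives of the gaze coordinates amounts to summing their first differences over the epoch; a minimal sketch (hypothetical helper, not the authors' code):

```python
import numpy as np

def gaze_displacement(x, y):
    """Net gaze-centre displacement over an epoch, obtained by integrating
    (summing) the sample-to-sample derivatives of the gaze coordinates.
    For a clean signal this equals (x[-1] - x[0], y[-1] - y[0])."""
    dx = np.diff(np.asarray(x, dtype=float))
    dy = np.diff(np.asarray(y, dtype=float))
    return dx.sum(), dy.sum()
```

Each resulting (dx, dy) vector gives one dot of the scatter plot, with its angle indicating the main direction of the eye movement towards the next letter.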
Figure 4
The grand-average sensitivity and specificity values (a), along with the Utility gain (b), after 100 Monte-Carlo cross-validation repetitions, as the decision threshold moves within the normalized SVM margins.
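The threshold-moving evaluation behind Figure 4 can be illustrated as a sweep over the classifier's signed decision values (hypothetical helper; the paper's Utility gain metric is not reproduced here):

```python
import numpy as np

def sweep_threshold(scores, labels, thresholds):
    """Sensitivity and specificity of an error detector as the decision
    threshold moves across the (normalized) SVM margin.
    labels: 1 = erroneous keystroke, 0 = correct keystroke."""
    scores, labels = np.asarray(scores), np.asarray(labels)
    out = []
    for t in thresholds:
        pred = scores > t                      # flagged as error
        tp = np.sum(pred & (labels == 1))
        tn = np.sum(~pred & (labels == 0))
        sens = tp / max(labels.sum(), 1)       # fraction of errors caught
        spec = tn / max((labels == 0).sum(), 1)  # fraction of correct keys kept
        out.append((t, sens, spec))
    return out
```

Raising the threshold trades sensitivity for specificity, which is exactly the trade-off the caption describes for the normalized SVM margins.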
