PLoS One. 2016 Jan 26;11(1):e0146848.
doi: 10.1371/journal.pone.0146848. eCollection 2016.

Toward FRP-Based Brain-Machine Interfaces – Single-Trial Classification of Fixation-Related Potentials


Andrea Finke et al. PLoS One. 2016.

Abstract

The co-registration of eye tracking and electroencephalography provides a holistic measure of ongoing cognitive processes. Recently, fixation-related potentials have been introduced to quantify the neural activity in such bi-modal recordings. Fixation-related potentials are time-locked to fixation onsets, just like event-related potentials are locked to stimulus onsets. Compared to existing electroencephalography-based brain-machine interfaces that depend on visual stimuli, fixation-related potentials have the advantages that they can be used in free, unconstrained viewing conditions and can also be classified on a single-trial level. Thus, fixation-related potentials have the potential to allow for conceptually different brain-machine interfaces that directly interpret cortical activity related to the visual processing of specific objects. However, existing research has investigated fixation-related potentials only with very restricted and highly unnatural stimuli in simple search tasks while participants' body movements were restricted. We present a study where we relieved many of these restrictions while retaining some control by using a gaze-contingent visual search task. In our study, participants had to find a target object out of 12 complex and everyday objects presented on a screen while the electrical activity of the brain and eye movements were recorded simultaneously. Our results show that our proposed method for the classification of fixation-related potentials can clearly discriminate between fixations on relevant, non-relevant and background areas. Furthermore, we show that our classification approach generalizes not only to different test sets from the same participant, but also across participants. These results promise to open novel avenues for exploiting fixation-related potentials in electroencephalography-based brain-machine interfaces and thus to provide a novel means for intuitive human-machine interaction.
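The single-trial classification pipeline described in the abstract (epochs time-locked to fixation onsets, classified into target, non-target and background fixations with 10-fold cross-validation) can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' implementation: the epoch counts, the injected class offset and the use of scikit-learn's LDA are assumptions made for the sketch; only the 12 EEG channels, the three classes and the 10-fold scheme come from the paper.

```python
# Hypothetical sketch: single-trial classification of fixation-related
# potentials (FRPs). Real data would be EEG epochs time-locked to fixation
# onsets from co-registered eye tracking; here we use synthetic epochs.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_epochs, n_channels, n_samples = 300, 12, 20  # 12 EEG channels, as in the study

# Classes: 0 = target, 1 = non-target, 2 = background fixation
y = rng.integers(0, 3, n_epochs)
X = rng.normal(size=(n_epochs, n_channels, n_samples))
X += 0.3 * y[:, None, None]  # inject a small class-dependent offset (toy signal)

# One feature vector per fixation: flatten channels x time
X_feat = X.reshape(n_epochs, -1)

clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, X_feat, y, cv=10)  # intra-subject 10-fold CV
print(scores.mean())  # expected well above the ~0.33 chance level on this toy data
```

On real FRP data, the feature extraction and classifier choice matter far more than in this toy example; the sketch only shows the epoch-to-feature-vector-to-classifier flow.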


Conflict of interest statement

Competing Interests: The authors have declared that no competing interests exist.

Figures

Fig 1
Left: A participant performing the search task while eye movements and EEG data are recorded simultaneously. The black box below the screen is the EyeFollower remote eye tracker. Right: Example scanpath of participant 1, who has to find the dart among 12 objects in trial 48. The circles denote fixations; their size is proportional to fixation duration, and the numbers indicate their chronological order. The lines symbolize saccades. The figure illustrates that participants apply a grid-like scanpath to successively check the different objects for a match in the gaze-contingent search task.
Fig 2
Top row: Verification of linear separability for the best (left) and poorest (right) datasets by inspecting the projections of the feature vectors onto a 2D space spanned by the two largest eigenvectors of a 3-class FDA. For the best dataset, the 3-class FDA separates the samples into three clearly distinguishable classes, depending on whether the fixation landed on a target, a non-target, or the background. Although the overlapping regions are larger for the poorest dataset, the three classes remain clearly evident. Bottom left: Comparison of the ROCs of participant 8 in the intra-subject classification scheme using only EEG channels versus only EOG channels. The ROC values are averaged over the ten cross-validation runs. Bottom right: Classification results for the 3-class FDA. The dotted black line indicates the chance level.
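The Fig 2 projection (feature vectors mapped onto the 2D space spanned by the two largest discriminant directions of a 3-class FDA) can be reproduced in principle with a few lines. This sketch uses synthetic class-shifted features and scikit-learn's `LinearDiscriminantAnalysis` as the FDA; the feature dimensionality and sample count are arbitrary assumptions.

```python
# Hypothetical sketch of the Fig 2 visualization: project feature vectors
# onto the 2D discriminant space of a 3-class Fisher discriminant analysis.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
y = rng.integers(0, 3, 200)                  # target / non-target / background
X = rng.normal(size=(200, 50)) + y[:, None]  # class-shifted toy features

fda = LinearDiscriminantAnalysis(n_components=2)  # 3 classes -> at most 2 components
Z = fda.fit_transform(X, y)
print(Z.shape)  # (200, 2): one 2D point per fixation, ready for a scatter plot
```

With three classes, FDA yields at most two discriminant directions, which is exactly why the paper's inspection plot is two-dimensional.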
Fig 3. Grand average over all ten participants for target, non-target and background FRPs.
Fig 4. Classification results.
Intra-subject 10-fold cross-validation with all 12 EEG channels. Please note that the y-axis starts at the chance level.
Fig 5. Classification results.
Generalized inter-subject classifier using all 12 EEG channels. The classifier was trained on the compound data of nine participants and validated on the left-out set. The x-axis labels indicate the index of the test set. Please note that the y-axis starts at the chance level.
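The inter-subject scheme of Fig 5 (train on the compound data of nine participants, validate on the left-out one) is a leave-one-subject-out cross-validation. A minimal sketch with synthetic data, using scikit-learn's `LeaveOneGroupOut` splitter (the splitter choice, epoch counts and feature sizes are assumptions; the ten participants and three classes come from the paper):

```python
# Hypothetical sketch of leave-one-subject-out evaluation: each fold trains
# on nine participants' pooled epochs and tests on the tenth.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(2)
n_subjects, epochs_per_subject = 10, 60
groups = np.repeat(np.arange(n_subjects), epochs_per_subject)  # subject index per epoch
y = rng.integers(0, 3, n_subjects * epochs_per_subject)
X = rng.normal(size=(len(y), 40)) + 0.5 * y[:, None]  # toy class-dependent features

logo = LeaveOneGroupOut()  # each fold leaves one participant out entirely
scores = cross_val_score(LinearDiscriminantAnalysis(), X, y,
                         cv=logo, groups=groups)
print(len(scores))  # 10 folds: one accuracy per left-out participant
```

Keeping all of a participant's epochs in the same fold is what makes the estimate a genuine test of cross-participant generalization, rather than letting the classifier see the test subject during training.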
Fig 6. Classification results.
Intra-subject 10-fold cross-validation with only the EOG channels. Please note that the y-axis starts at the chance level.
Fig 7. The results of classifying compound feature vectors with both the EEG and the EOG channels in the target versus rest case.
The dotted line indicates the chance level.


