Review
Brain Sci. 2020 Sep 29;10(10):687.
doi: 10.3390/brainsci10100687.

Advances in Multimodal Emotion Recognition Based on Brain-Computer Interfaces


Zhipeng He et al. Brain Sci. 2020.

Abstract

With the continuous development of portable noninvasive human sensor technologies such as brain-computer interfaces (BCI), multimodal emotion recognition has attracted increasing attention in the area of affective computing. This paper primarily discusses the progress of research into multimodal emotion recognition based on BCI and reviews three types of multimodal affective BCI (aBCI): aBCI based on a combination of behavior and brain signals, aBCI based on various hybrid neurophysiology modalities and aBCI based on heterogeneous sensory stimuli. For each type of aBCI, we further review several representative multimodal aBCI systems, including their design principles, paradigms, algorithms, experimental results and corresponding advantages. Finally, we identify several important issues and research directions for multimodal emotion recognition based on BCI.
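The multimodal fusion that the surveyed aBCI systems rely on can be sketched at the feature level: features from each modality are normalized and concatenated before classification. The sketch below is illustrative only; the feature dimensions, the synthetic data and the nearest-centroid classifier (standing in for the SVM/deep classifiers typically used in such systems) are assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic per-trial features from two modalities (dimensions are arbitrary):
# 32 EEG-derived features and 8 eye-movement features per trial.
n_trials = 40
eeg_feats = rng.normal(size=(n_trials, 32))
eye_feats = rng.normal(size=(n_trials, 8))
labels = rng.integers(0, 2, size=n_trials)  # binary valence labels

def zscore(x):
    """Standardize each feature column so modalities share a common scale."""
    return (x - x.mean(axis=0)) / (x.std(axis=0) + 1e-8)

# Feature-level fusion: z-score each modality, then concatenate.
fused = np.hstack([zscore(eeg_feats), zscore(eye_feats)])

# Nearest-centroid classifier as a minimal stand-in for a real classifier.
centroids = np.stack([fused[labels == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(np.linalg.norm(fused[:, None, :] - centroids, axis=2), axis=1)
accuracy = (pred == labels).mean()
```

Decision-level fusion, the other common strategy, would instead train one classifier per modality and combine their outputs (e.g., by weighted voting).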

Keywords: affective computing; brain–computer interface (BCI); emotion recognition; multimodal fusion.


Conflict of interest statement

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Figures

Figure 1
Flowchart of multimodal emotional involvement for electroencephalogram (EEG)-based affective brain–computer interfaces (BCI).
Figure 2
(a) Flowchart of emotion recognition based on facial expression. Facial expression recognition comprises three stages: face detection and localization, feature extraction and expression classification. Face detection typically follows one of two approaches: feature-based or image-based [49]. The most commonly used emotion recognition methods for facial expressions are geometric and texture feature recognition and facial action unit recognition. Expressions are usually classified into seven basic categories (fear, disgust, joy, anger, sadness, surprise and contempt), but emotional states are complex and can be further divided into a series of combined emotions, including complex expressions, abnormal expressions and microexpressions. (b) Overview of emotionally relevant eye-movement features (AOI 1: first area of interest; AOI 2: second area of interest). From pupil diameter, gaze fixation and saccade data, the three basic eye-movement characteristics, statistics can be derived such as event frequencies and their extreme values (e.g., fixation frequency and total/maximum fixation dispersion).
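The geometric-feature stage of the pipeline in Figure 2a can be sketched in a few lines: distances between facial landmarks are normalized by a reference length so the features are scale-invariant, then matched against per-expression prototypes. The landmark coordinates, the two features and the class centroids below are all hypothetical, chosen only to illustrate the idea:

```python
import numpy as np

# Toy 2-D facial landmarks (x, y); real systems obtain these from a
# landmark detector -- here they are hand-set for illustration.
landmarks = {
    "left_eye":  np.array([30.0, 40.0]),
    "right_eye": np.array([70.0, 40.0]),
    "mouth_l":   np.array([35.0, 75.0]),
    "mouth_r":   np.array([65.0, 75.0]),
    "mouth_top": np.array([50.0, 70.0]),
    "mouth_bot": np.array([50.0, 84.0]),
}

def geometric_features(lm):
    """Scale-invariant features: distances normalized by inter-ocular distance."""
    iod = np.linalg.norm(lm["right_eye"] - lm["left_eye"])
    mouth_width = np.linalg.norm(lm["mouth_r"] - lm["mouth_l"]) / iod
    mouth_open = np.linalg.norm(lm["mouth_bot"] - lm["mouth_top"]) / iod
    return np.array([mouth_width, mouth_open])

feats = geometric_features(landmarks)

# Hypothetical class centroids in this 2-feature space (illustrative values).
centroids = {"neutral": np.array([0.70, 0.15]), "joy": np.array([0.90, 0.35])}
label = min(centroids, key=lambda c: np.linalg.norm(feats - centroids[c]))
```

A production system would use dozens of landmarks and a trained classifier, but the normalize-then-compare structure is the same.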
Figure 3
(a) Emotion-relevant features extracted from different physiological signals. The signals and their recording sites are listed, including EEG, EOG, respiration (RSP), EMG, galvanic skin response (GSR), blood volume pulse (BVP) and skin temperature. Emotion-relevant features can be extracted from these time-varying signals in three domains: the time, frequency and time–frequency domains. (b) Measurement of brain activity by EEG, fNIRS and fMRI. Briefly, these three cortical imaging modalities work as follows: fMRI is based on radio-frequency (RF) transmission and reception, fNIRS on light emission and detection, and EEG on electrical potentials.
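The time- and frequency-domain feature extraction mentioned in Figure 3a can be illustrated with a minimal numpy sketch on a synthetic one-channel "EEG" signal; the sampling rate, the alpha-band definition (8–13 Hz) and the synthetic signal itself are assumptions for demonstration, not the paper's method:

```python
import numpy as np

fs = 128                       # sampling rate (Hz), a common EEG rate
t = np.arange(0, 4, 1 / fs)    # 4 s of signal
rng = np.random.default_rng(1)

# Synthetic one-channel "EEG": a 10 Hz alpha rhythm plus Gaussian noise.
x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)

# Time-domain features: simple signal statistics.
mean, var = x.mean(), x.var()

# Frequency-domain feature: alpha-band (8-13 Hz) power from the periodogram.
freqs = np.fft.rfftfreq(x.size, d=1 / fs)
psd = np.abs(np.fft.rfft(x)) ** 2 / x.size
alpha_power = psd[(freqs >= 8) & (freqs <= 13)].sum()
total_power = psd[freqs > 0].sum()
alpha_ratio = alpha_power / total_power   # high here, since alpha dominates
```

Time–frequency features (e.g., from a short-time Fourier transform or wavelets) extend this by tracking how such band powers evolve over the trial.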
Figure 4
Open challenges and opportunities in multimodal emotion recognition for EEG-based brain–computer interfaces (BCI).


References

    1. Wolpaw J.R., Birbaumer N., McFarland D.J., Pfurtscheller G., Vaughan T.M. Brain–computer interfaces for communication and control. Clin. Neurophysiol. 2002;113:767–791. doi: 10.1016/S1388-2457(02)00057-3.
    2. Mühl C., Nijholt A., Allison B., Dunne S., Heylen D. Affective brain-computer interfaces (aBCI 2011); Proceedings of the International Conference on Affective Computing and Intelligent Interaction; Memphis, TN, USA. 9–12 October 2011; p. 435.
    3. Mühl C., Allison B., Nijholt A., Chanel G. A survey of affective brain computer interfaces: Principles, state-of-the-art, and challenges. Brain-Comput. Interfaces. 2014;1:66–84. doi: 10.1080/2326263X.2014.912881.
    4. Van den Broek E.L. Cognitive Behavioural Systems. Springer; Berlin/Heidelberg, Germany: 2012. Affective computing: A reverence for a century of research; pp. 434–448.
    5. Ekman P.E., Davidson R.J. The Nature of Emotion: Fundamental Questions. Oxford University Press; Oxford, UK: 1994.
