Adv Sci (Weinh). 2021 Oct;8(19):e2101129. doi: 10.1002/advs.202101129. Epub 2021 Jul 17.

Wireless Soft Scalp Electronics and Virtual Reality System for Motor Imagery-Based Brain-Machine Interfaces

Musa Mahmood et al. Adv Sci (Weinh). 2021 Oct.

Abstract

Motor imagery offers an excellent opportunity as a stimulus-free paradigm for brain-machine interfaces. Conventional electroencephalography (EEG) for motor imagery requires a hair cap with multiple wired electrodes and messy gels, causing motion artifacts. Here, a wireless scalp electronic system with virtual reality for real-time, continuous classification of motor imagery brain signals is introduced. This low-profile, portable system integrates imperceptible microneedle electrodes and soft wireless circuits. Virtual reality addresses subject variance in detectable EEG response to motor imagery by providing clear, consistent visuals and instant biofeedback. The wearable soft system offers advantageous contact surface area and reduced electrode impedance density, resulting in significantly enhanced EEG signals and classification accuracy. The combination with convolutional neural network-machine learning provides a real-time, continuous motor imagery-based brain-machine interface. With four human subjects, the scalp electronic system offers a high classification accuracy (93.22 ± 1.33% for four classes), allowing wireless, real-time control of a virtual reality game.
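The "real-time, continuous classification" described above implies segmenting the streaming six-channel EEG into overlapping windows before each CNN inference. A minimal sketch of that windowing step, assuming a 250 Hz sampling rate and a 1 s stride (neither is stated in the abstract; the 4 s window length matches the best-performing case in Figure 3):

```python
import numpy as np

FS = 250     # assumed sampling rate (Hz); not given in the abstract
N_CH = 6     # six recording channels (Figure 1D)
WIN_S = 4    # 4 s window, the best-performing length in Figure 3

def sliding_windows(eeg, fs=FS, win_s=WIN_S, step_s=1):
    """Split a (channels, samples) EEG array into overlapping windows
    for continuous, real-time classification."""
    win, step = int(win_s * fs), int(step_s * fs)
    n = (eeg.shape[1] - win) // step + 1
    return np.stack([eeg[:, i * step : i * step + win] for i in range(n)])

# Example: 10 s of synthetic 6-channel EEG -> seven 4 s windows (1 s stride)
eeg = np.random.randn(N_CH, 10 * FS)
windows = sliding_windows(eeg)
print(windows.shape)  # (7, 6, 1000)
```

Each window would then be preprocessed (e.g., bandpass filtered) and passed to the classifier, so a new prediction is available every stride interval rather than once per window.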

Keywords: brain-machine interfaces; motor imagery brain signals; virtual reality system; wireless soft scalp electronics.

Conflict of interest statement

Georgia Tech has a pending patent application related to the work described here.

Figures

Figure 1
Overview of the wireless scalp system for motor imagery brain signal detection, featuring fully portable electronics, stretchable interconnectors, and flexible microneedle arrays. A) An illustration of a subject wearing a VR headset and scalp electronics with a close‐up of stretchable interconnectors (top‐right photo) and a flexible microneedle electrode (bottom‐right photo). B) A zoomed‐in photo of an array of microneedles along with a magnified SEM image of the needles (inset). C) A picture of a flexible wireless circuit with integrated chips, showing mechanical compliance of the membrane circuit. D) Locations (top view) of the microneedle electrodes for classification of motor imagery brain signals, including six recording channels, one reference (REF) and one ground (GND). E) A flow chart describing the entire data processing sequence from EEG recording to the control of targets in a virtual reality system via a machine learning classification algorithm (convolutional neural network, CNN).
Figure 2
Characterization of stretchable EEG interconnectors and flexible microneedle electrodes (FMNE). A) Photos of a stretchable EEG interconnector stretched up to 100%. B) Electrical measurement of the resistance change of the interconnector over 100 cycles at 60% strain, showing negligible changes in resistance. The inset shows the resistance change over a single stretch cycle. C) Measurement of the electrical resistance change of the interconnector, showing mechanical fracture after 250% tensile stretching. D) A series of SEM close‐up images of a microneedle electrode tip: unused (left; inset: zoomed‐in view of the tip), after ten insertions into porcine skin (middle), and after 100 insertions into the tissue (right). E) Skin–electrode contact impedance comparing the performance of the FMNE and conventional Ag/AgCl cup electrodes (error: standard deviation, n = 4 subjects). F) Impedance density calculated from the data in (E), showing a dramatically higher impedance density for the conventional electrode. G) SNR comparison of EEG alpha rhythms measured by the FMNE (top) and conventional Ag/AgCl electrodes (bottom).
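The impedance-density comparison in panel F normalizes the measured skin–electrode impedance by the electrode's contact surface area, which is why a microneedle array with a larger effective contact area can show a much lower density even at broadly similar raw impedance. A sketch of that calculation with made-up numbers (the impedances and areas below are illustrative assumptions, not the paper's measurements):

```python
def impedance_density(impedance_ohm, contact_area_cm2):
    """Impedance per unit contact area (ohm / cm^2)."""
    return impedance_ohm / contact_area_cm2

# Hypothetical values for illustration only:
fmne = impedance_density(20e3, 1.2)  # flexible microneedle electrode array
cup = impedance_density(25e3, 0.8)   # conventional Ag/AgCl cup electrode
print(fmne < cup)  # True: lower density despite comparable raw impedance
```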
Figure 3
Preprocessing and classification of motor imagery brain signals with CNNs. A) Detailed illustration of a spatial CNN model with hidden layers for brain signals acquired from six EEG channels. This model demonstrates the capability of decomposing spatial features from multiple dipolar sources in the motor cortex. B) Comparison of spatial‐CNN classification accuracy for four cases: raw data, high‐pass filtered data (HPF), bandpass‐filtered data (Bandpass), and power spectral density analysis (PSDA), across multiple window lengths (1, 2, and 4 s). Error bars show the standard error from four subjects. C) Comparison of spatial‐CNN classification accuracy between conventional Ag/AgCl gel electrodes and the newly developed FMNE across multiple window lengths (1, 2, and 4 s). Error bars show the standard error from four subjects. D) A confusion matrix representing results from the real‐time accuracy test of motor imagery brain data acquired by conventional Ag/AgCl electrodes, with an overall accuracy of 89.65% (N = 2240 samples, window length = 4 s, four human subjects). E) A confusion matrix representing results from the real‐time accuracy test of motor imagery brain data acquired by the FMNE, with an overall accuracy of 93.22% (N = 2240 samples, window length = 4 s, four human subjects).
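The "spatial" first layer of such a CNN can be pictured as a bank of learned filters, each mixing all six channels at every time step, which is what lets the model separate contributions from different dipolar sources. A minimal numpy sketch of that operation (random stand-in weights, not the paper's trained model):

```python
import numpy as np

rng = np.random.default_rng(0)

def spatial_conv(window, weights):
    """Apply n_filters spatial filters (one weight per channel) to a
    (channels, samples) window -> (n_filters, samples) feature maps."""
    # Each output row is a channel-weighted sum of the input, per time step.
    return weights @ window

window = rng.standard_normal((6, 1000))   # one 4 s window at an assumed 250 Hz
weights = rng.standard_normal((8, 6))     # 8 hypothetical spatial filters x 6 channels
features = spatial_conv(window, weights)
print(features.shape)  # (8, 1000)
```

In a full model these feature maps would pass through further hidden layers and a softmax over the four motor imagery classes; this sketch shows only the channel-mixing step that gives the spatial CNN its name.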
Figure 4
Virtual reality (VR) implementation for motor imagery training and real‐time control of a video game demonstration. A) An overview of the study setup, including a subject wearing the SSE, real‐time EEG data measured from six electrodes (top inset), an example of a VR interface (middle inset), and a photo of a subject wearing a VR headset (bottom‐right inset). Examples of the training and testing processes of the VR game appear in Videos S1 and S2 (Supporting Information); a video clip of a subject demonstrating highly accurate video game performance is shown in Video S3 (Supporting Information). B) A modified view of the VR visuals provided to test a subject with text and animation prompts. C) A video game interface designed for MI response testing with clear color‐coded visual cues as well as a text prompt. D) An example of evaluation output according to the target class. E) An accuracy comparison between the non‐VR setup and the VR setup (two types of electrodes), classified with the spatial‐CNN model, demonstrating the superior performance of VR as a training tool (n = 2240 samples from four subjects, 560 samples per subject, window length w = 4 s).
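The overall accuracies reported with the confusion matrices (89.65% and 93.22% over N = 2240 samples) are simply the correct predictions (the matrix diagonal) divided by the total sample count. A sketch with a made-up 4-class matrix (hypothetical counts, not the paper's data):

```python
import numpy as np

def overall_accuracy(confusion):
    """Fraction of correctly classified samples in a square confusion matrix."""
    confusion = np.asarray(confusion)
    return np.trace(confusion) / confusion.sum()

# Hypothetical 4-class counts, 560 samples per class, N = 2240 total:
cm = np.array([[520,  15,  13,  12],
               [ 14, 525,  10,  11],
               [ 12,  11, 522,  15],
               [ 13,  12,  14, 521]])
print(round(overall_accuracy(cm), 4))  # 0.9321
```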

