2017 Mar 1;11:68.
doi: 10.3389/fnhum.2017.00068. eCollection 2017.

Reaching and Grasping a Glass of Water by Locked-In ALS Patients through a BCI-Controlled Humanoid Robot

Rossella Spataro et al. Front Hum Neurosci.

Abstract

Locked-in Amyotrophic Lateral Sclerosis (ALS) patients are fully dependent on caregivers for any daily need. At this stage, basic communication and environmental control may not be possible even with commonly used augmentative and alternative communication devices. Brain Computer Interface (BCI) technology allows users to modulate brain activity for communication and for control of machines and devices, without requiring motor control. In the last several years, numerous articles have described how persons with ALS could effectively use BCIs for different goals, usually spelling. In the present study, locked-in ALS patients used a BCI system to directly control the humanoid robot NAO (Aldebaran Robotics, France) with the aim of reaching and grasping a glass of water. Four ALS patients and four healthy controls were recruited and trained to operate this humanoid robot through a P300-based BCI. A few minutes of training was sufficient to operate the system efficiently in different environments. Three of the four ALS patients and all controls successfully performed the task with a high level of accuracy. These results suggest that BCI-operated robots can be used by locked-in ALS patients as an artificial alter ego, the machine being able to move, speak and act in their place.

Keywords: amyotrophic lateral sclerosis; brain computer interface; environmental control; humanoid robot; locked-in syndrome.


Figures

Figure 1
The visual evoked potential (VEP) user interface. This interface consists of six low-level commands (the four directions: forward, backward, left, and right, plus two turn commands) and two high-level commands, grasp and give, which make the robot autonomously grasp the glass and bring it to the user.
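The 6 + 2 command structure of this interface can be sketched as a small dispatch table. A minimal illustration only: the command identifiers and action descriptions below are our assumptions, not the paper's actual software interface.

```python
# Hypothetical dispatch table for the eight-item interface of Figure 1.
# Six low-level commands drive the robot step by step; two high-level
# commands delegate a whole reach-and-bring action to the robot.
LOW_LEVEL = {
    "forward": "step forward",
    "backward": "step backward",
    "left": "sidestep left",
    "right": "sidestep right",
    "turn_left": "rotate left",
    "turn_right": "rotate right",
}
HIGH_LEVEL = {
    "grasp": "autonomously locate and grasp the glass",
    "give": "autonomously bring the glass to the user",
}

def describe(selection):
    """Map a selected interface item to its command level and action."""
    if selection in LOW_LEVEL:
        return ("low", LOW_LEVEL[selection])
    if selection in HIGH_LEVEL:
        return ("high", HIGH_LEVEL[selection])
    raise KeyError(f"unknown selection: {selection}")
```

Splitting the set this way mirrors the paper's design choice: low-level commands give the user fine control, while the two high-level commands offload path planning and grasping to the robot.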
Figure 2
The system architecture. The system consists of three main parts. The BCI architecture acquires the EEG, extracts features, and translates them into commands. The Network System provides an interface that sends the selected command to the robot, which may be in a remote location. The Robotic System comprises an AI Module that translates the received commands into actions of the NAO robot.
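The Network System's role of forwarding each selected command to a possibly remote robot can be sketched as a small TCP client. This is a sketch under our own assumptions: the newline-delimited JSON framing, the host/port, and the command vocabulary are illustrative, not the paper's actual protocol.

```python
import json
import socket

# Command names assumed from the Figure 1 interface; not the paper's wire format.
ALLOWED = {"forward", "backward", "left", "right",
           "turn_left", "turn_right", "grasp", "give"}

def encode_command(cmd):
    """Serialize one BCI selection as a newline-delimited JSON message
    that a robot-side AI module could parse."""
    if cmd not in ALLOWED:
        raise ValueError(f"unknown command: {cmd}")
    return (json.dumps({"cmd": cmd}) + "\n").encode("utf-8")

def send_command(host, port, cmd):
    """Open a TCP connection to the (possibly remote) robot and send one command."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(encode_command(cmd))
```

Separating encoding from transport keeps the BCI side agnostic about where the robot is, which is the point of the Network System layer: the same selected command works whether the NAO is in the room or at a remote location.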
Figure 3
The linear discriminant analysis. The stimuli are classified into two classes using the one-vs.-all paradigm. One class represents the selected item (x in the figure); the other class (circle) represents all the other items. The two classes are separated by a hyperplane that discriminates between them. The process is iterated over all the items to find the class with the maximum distance from the hyperplane.
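The one-vs.-all selection step described above can be sketched with a standard two-class Fisher LDA: fit a discriminant separating "target" responses from "non-target" responses, score each item's averaged EEG epoch against the hyperplane, and select the item with the largest signed distance. A minimal sketch only; the paper's actual features, training procedure, and regularization are not specified here.

```python
import numpy as np

def fit_lda(X_target, X_rest):
    """Two-class Fisher LDA: return weights w and bias b so that
    score(x) = w @ x + b is the signed discriminant of epoch x."""
    mu1, mu0 = X_target.mean(axis=0), X_rest.mean(axis=0)
    # Pooled within-class covariance, lightly regularized for stability.
    c1 = X_target - mu1
    c0 = X_rest - mu0
    Sw = (c1.T @ c1 + c0.T @ c0) / (len(X_target) + len(X_rest) - 2)
    Sw += 1e-6 * np.eye(Sw.shape[0])
    w = np.linalg.solve(Sw, mu1 - mu0)
    b = -w @ (mu1 + mu0) / 2.0
    return w, b

def select_item(epochs, w, b):
    """One-vs.-all selection: score each item's averaged epoch and pick
    the item lying farthest on the 'target' side of the hyperplane."""
    scores = epochs @ w + b
    return int(np.argmax(scores))
```

The iteration over items reduces to a single `argmax` here because each row of `epochs` is one item's averaged response, so the largest score identifies the item the user attended to.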
Figure 4
The two scenarios in which the robot operated. In scenario 1, the user is in bed and selects two commands: grasp to take the object and give to bring it back; the robot autonomously calculates the best path to accomplish the action. In scenario 2, the user sits at the table and controls the robot with low-level (forward, turn left, forward, turn left) and high-level (grasp, give) commands.
