PLOS Digit Health. 2022 Jun 30;1(6):e0000061.
doi: 10.1371/journal.pdig.0000061. eCollection 2022 Jun.

A pilot study of the Earable device to measure facial muscle and eye movement tasks among healthy volunteers


Matthew F Wipperman et al. PLOS Digit Health. 2022.

Abstract

The Earable device is a behind-the-ear wearable originally developed to measure cognitive function. Since Earable measures electroencephalography (EEG), electromyography (EMG), and electrooculography (EOG), it may also have the potential to objectively quantify facial muscle and eye movement activities relevant to the assessment of neuromuscular disorders. As an initial step toward developing a digital assessment in neuromuscular disorders, a pilot study was conducted to determine whether the Earable device could be used to objectively measure facial muscle and eye movements intended to be representative of Performance Outcome Assessments (PerfOs), with tasks designed to model clinical PerfOs, referred to as mock-PerfO activities. The specific aims of this study were: to determine whether the raw Earable EMG, EOG, and EEG signals could be processed to extract features describing these waveforms; to determine Earable feature data quality, test-retest reliability, and statistical properties; to determine whether features derived from Earable could be used to distinguish between various facial muscle and eye movement activities; and to determine which features and feature types are important for mock-PerfO activity classification. A total of N = 10 healthy volunteers participated in the study. Each participant performed 16 mock-PerfO activities, including talking, chewing, swallowing, eye closure, gazing in different directions, puffing cheeks, chewing an apple, and making various facial expressions. Each activity was repeated four times in the morning and four times at night. A total of 161 summary features were extracted from the EEG, EMG, and EOG bio-sensor data. Feature vectors were used as input to machine learning models to classify the mock-PerfO activities, and model performance was evaluated on a held-out test set.
Additionally, a convolutional neural network (CNN) was used to classify low-level representations of the raw bio-sensor data for each task, and its performance was evaluated and compared directly to that of feature-based classification. Model prediction accuracy was used to quantitatively assess the Earable device's ability to classify activities. Study results indicate that Earable can potentially quantify different aspects of facial and eye movements and may be used to differentiate mock-PerfO activities. Specifically, Earable was found to differentiate talking, chewing, and swallowing tasks from other tasks with observed F1 scores >0.9. While EMG features contribute to classification accuracy for all tasks, EOG features are important for classifying gaze tasks. Finally, we found that analysis with summary features outperformed a CNN for activity classification. We believe Earable may be used to measure cranial muscle activity relevant for neuromuscular disorder assessment. Classification performance of mock-PerfO activities with summary features enables a strategy for detecting disease-specific signals relative to controls, as well as for monitoring intra-subject treatment responses. Further testing is needed to evaluate the Earable device in clinical populations and clinical development settings.
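The summary-feature classification strategy described in the abstract can be sketched as follows. This is an illustrative toy, not the study's pipeline: the data are synthetic stand-ins (16 activity classes, 161 features, mirroring the study's dimensions), and a simple nearest-centroid classifier replaces the study's machine learning models.

```python
import numpy as np

# Toy sketch of activity classification from summary features, on
# synthetic stand-in data (16 classes x 161 features, as in the study).
# The nearest-centroid classifier is an illustrative assumption, not
# the study's actual model.
rng = np.random.default_rng(0)
n_classes, n_train, n_test, n_feat = 16, 6, 2, 161

class_means = rng.normal(0.0, 3.0, (n_classes, n_feat))

def sample(n_per_class):
    # draw n_per_class noisy repeats around each class mean
    X = np.vstack([m + rng.normal(0.0, 1.0, (n_per_class, n_feat))
                   for m in class_means])
    y = np.repeat(np.arange(n_classes), n_per_class)
    return X, y

X_tr, y_tr = sample(n_train)
X_te, y_te = sample(n_test)          # held-out test set

# z-score features using training statistics, then label each held-out
# repeat with its nearest class centroid
mu, sd = X_tr.mean(0), X_tr.std(0) + 1e-9
Z_tr, Z_te = (X_tr - mu) / sd, (X_te - mu) / sd
centroids = np.vstack([Z_tr[y_tr == c].mean(0) for c in range(n_classes)])
dists = ((Z_te[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
pred = dists.argmin(axis=1)
accuracy = (pred == y_te).mean()
```

On such well-separated synthetic classes even this toy classifier performs near perfectly; the study's held-out F1 evaluation follows the same train/test discipline.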


Conflict of interest statement

The authors have read the journal’s policy and the authors of this manuscript have the following competing interests: MFW, KFM, XW, YC, OL, AA, SCH, RA, and OH are current or former employees and shareholders of Regeneron Pharmaceuticals, Inc. GP, RRD, and TV are employees of Earable, Inc.

Figures

Fig 1
A. A schematic of the Earable device used in this pilot study. The left and right earpiece components, which sit over the ears of the participant, house the primary electrical components of the device and are connected by a flexible circuit within a connecting band that allows the device to be worn comfortably by participants with varying head shapes and sizes. The microcontroller unit (MCU) enables data acquisition, signal processing, and on-device data processing. A Bluetooth module is incorporated for data streaming directly to a local computer or mobile phone. Multi-channel electrophysiological data are acquired through dry electrodes that contact the participant’s scalp above each ear and over each mastoid process. Although not directly applicable to this pilot study, music and audio-based stimulation and therapy are made available to the patient via bone-conduction speakers. B. Signal processing and feature extraction pipeline used in this study. A signal separation module is applied to the mixed signal derived from the Earable device to separate the EEG, EMG, and EOG waves into their component parts. These signals are then subject to an event-based segmentation algorithm, and features are extracted. C. Time and frequency representations of EMG activity resulting from a participant drinking water. This plot shows approximately 6.5 s of EMG data in both the time (top) and frequency (bottom) domains. D. EMG activity visualized in the time domain over the 16 activities.
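The time and frequency views in panel C can be reproduced in outline with a short-time Fourier transform. The sketch below uses a synthetic gated-noise burst as an "EMG-like" signal; the 250 Hz sampling rate, window, and hop sizes are assumptions for illustration, not the device's actual parameters.

```python
import numpy as np

# Illustrative time/frequency decomposition in the spirit of panel C:
# a synthetic burst (gated noise) over 6.5 s, analyzed with a
# short-time Fourier transform. fs, win, and hop are assumptions.
fs = 250                               # assumed sampling rate (Hz)
t = np.arange(0, 6.5, 1 / fs)          # 6.5 s of samples
rng = np.random.default_rng(1)
x = rng.normal(size=t.size) * ((t > 2.0) & (t < 4.0))  # burst from 2 s to 4 s

win, hop = 128, 64                     # STFT window and hop (samples)
starts = range(0, x.size - win + 1, hop)
frames = np.array([x[i:i + win] * np.hanning(win) for i in starts])
spec = np.abs(np.fft.rfft(frames, axis=1))   # rows = time frames, cols = freq bins

frame_energy = (spec ** 2).sum(axis=1)       # energy concentrates inside the burst
```

Plotting `x` against `t` gives the time-domain view, and `spec` (as an image) the frequency-domain view.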
Fig 2. Dimensionality reduction and visualization of Earable features.
A. Spearman correlation of all 161 Earable features against each other, represented as a heatmap. K = 6 clusters from K-means clustering (the optimal number) are shown. All 16 mock-PerfOs were pooled for the correlation analysis. B. UMAP dimension reduction of all 161 Earable features. Each individual activity repeat is a point on this graph. The color of each point indicates the activity performed during that repeat. C. Heatmap of all 161 Earable features (rows) for all activity repeats (columns). Columns are sorted first by the 16 activities in the pilot study; within each activity, by participant; and then by the time of day when the activity was performed.
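The feature-feature Spearman correlation underlying panel A can be sketched as below, on synthetic stand-in data (8 features here rather than the 161 Earable features). Spearman correlation is simply Pearson correlation computed on ranks.

```python
import numpy as np

# Sketch of a feature-by-feature Spearman correlation matrix, as in
# panel A, on synthetic stand-in data (8 features, not the study's 161).
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 8))                    # activity repeats x features
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=200)   # one strongly correlated pair

ranks = X.argsort(axis=0).argsort(axis=0)        # column-wise ranks (no ties here)
R = np.corrcoef(ranks, rowvar=False)             # 8 x 8 Spearman matrix
```

`R` is what the panel A heatmap renders; clustering its rows (e.g. with K-means) groups redundant features, as in the figure.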
Fig 3
A. Activity-level classification F1 scores for all Earable features (161 features), Boruta-selected Earable features (101 features), and raw waveform data (CNN). F1 scores range from 0 to 1, with 1 indicating perfect classification. B. Feature attribution analysis using SHapley Additive exPlanations (SHAP) values for each feature (rows) for each activity (columns), determined on the model trained on the full set of 161 features [28]. SHAP values were z-scored across all activities.
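Panel B attributes classification skill to individual features with SHAP values. As a simpler stand-in illustrating the same idea, the sketch below computes permutation importance: how much accuracy drops when one feature's values are shuffled. The data and the threshold "model" are synthetic assumptions, not the study's model or the SHAP method itself.

```python
import numpy as np

# Permutation-importance sketch (a simpler stand-in for SHAP): shuffle
# each feature in turn and measure the resulting accuracy drop.
rng = np.random.default_rng(3)
n = 400
informative = rng.normal(size=n)        # this feature drives the label
noise = rng.normal(size=n)              # this feature is irrelevant
X = np.column_stack([informative, noise])
y = (informative > 0).astype(int)

def model_acc(Xm):
    # toy classifier: threshold on feature 0
    return (((Xm[:, 0] > 0).astype(int)) == y).mean()

base = model_acc(X)
importance = []
for j in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])    # break feature j's link to y
    importance.append(base - model_acc(Xp))
```

The informative feature shows a large accuracy drop when shuffled while the noise feature shows none, mirroring how panel B highlights EMG features for most tasks and EOG features for gaze tasks.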


References

    1. Brust JCM. Bradley’s Neurology in Clinical Practice, Sixth Edition. Neurology. 2013;81(12):1104. doi: 10.1212/WNL.0b013e3182a4a566
    2. Fabbrini G, Defazio G, Colosimo C, Thompson PD, Berardelli A. Cranial movement disorders: clinical features, pathophysiology, differential diagnosis and treatment. Nat Clin Pract Neurol. 2009;5(2):93–105. doi: 10.1038/ncpneuro1006
    3. Ali MR, Myers T, Wagner E, Ratnu H, Dorsey ER, Hoque E. Facial expressions can detect Parkinson’s disease: preliminary evidence from videos collected online. NPJ Digit Med. 2021;4(1):129. doi: 10.1038/s41746-021-00502-8
    4. Vaughan A, Gardner D, Miles A, Copley A, Wenke R, Coulson S. A Systematic Review of Physical Rehabilitation of Facial Palsy. Front Neurol. 2020;11:222. doi: 10.3389/fneur.2020.00222
    5. Carter BT, Luke SG. Best practices in eye tracking research. Int J Psychophysiol. 2020;155:49–62. doi: 10.1016/j.ijpsycho.2020.05.010
