Hands-Free User Interface for AR/VR Devices Exploiting Wearer's Facial Gestures Using Unsupervised Deep Learning
- PMID: 31614988
- PMCID: PMC6832972
- DOI: 10.3390/s19204441
Abstract
Developing a user interface (UI) suitable for headset environments is one of the challenges in the field of augmented reality (AR) technologies. This study proposes a hands-free UI for an AR headset that exploits the wearer's facial gestures to recognize user intentions. The facial gestures of the headset wearer are detected by a custom-designed sensor that senses skin deformation based on the infrared diffusion characteristics of human skin. We designed a deep neural network classifier to determine the user's intended gestures from the skin-deformation data, which serve as user input commands for the proposed UI system. The classifier is composed of a spatiotemporal autoencoder and a deep embedded clustering algorithm, trained in an unsupervised manner. The UI device was embedded in a commercial AR headset, and several experiments were performed on online sensor data to verify its operation. The resulting hands-free UI recognized user commands with an average accuracy of 95.4% in tests with participants.
Keywords: augmented reality; deep embedded clustering; hands-free interface; spatiotemporal autoencoder.
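
The unsupervised pipeline described in the abstract (a spatiotemporal autoencoder whose latent codes are grouped by deep embedded clustering, DEC) can be illustrated with a short sketch. The code below is not the authors' implementation: the PyTorch framework, the 1-D convolutional architecture, the tensor shapes (batch, sensor channels, time), the latent size, and the five-cluster count are all assumptions for illustration. Only the overall structure follows the abstract, namely an autoencoder reconstruction loss combined with the standard DEC soft-assignment/KL objective (Xie et al., 2016).

import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatioTemporalAE(nn.Module):
    """Assumed architecture: 1-D conv autoencoder over (channel, time) windows."""
    def __init__(self, channels=8, latent_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(channels, 32, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, latent_dim, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose1d(latent_dim, 32, kernel_size=5, stride=2,
                               padding=2, output_padding=1),
            nn.ReLU(),
            nn.ConvTranspose1d(32, channels, kernel_size=5, stride=2,
                               padding=2, output_padding=1),
        )

    def forward(self, x):
        z = self.encoder(x)               # latent spatiotemporal code
        return self.decoder(z), z

def soft_assignment(z, centroids, alpha=1.0):
    # DEC soft assignment: Student's t kernel between latent vectors
    # and cluster centroids (Xie et al., 2016).
    d2 = torch.cdist(z, centroids).pow(2)         # (batch, k) squared distances
    q = (1.0 + d2 / alpha).pow(-(alpha + 1.0) / 2.0)
    return q / q.sum(dim=1, keepdim=True)

def target_distribution(q):
    # Sharpened target distribution that emphasizes confident assignments.
    w = q.pow(2) / q.sum(dim=0)
    return w / w.sum(dim=1, keepdim=True)

# One hypothetical joint training step: reconstruction + clustering loss.
model = SpatioTemporalAE()
centroids = nn.Parameter(torch.randn(5, 16))      # assumed: 5 gesture clusters
x = torch.randn(4, 8, 64)                         # fake sensor windows
recon, z = model(x)
q = soft_assignment(z.mean(dim=2), centroids)     # pool time axis before DEC
p = target_distribution(q).detach()
loss = F.mse_loss(recon, x) + F.kl_div(q.log(), p, reduction="batchmean")
loss.backward()

In the paper's system, the resulting cluster assignments would be mapped to UI commands; the centroid count and the time-axis pooling step above are placeholders chosen for the sketch, not details taken from the paper.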
Conflict of interest statement
The authors declare no conflicts of interest.