2020 Jun 9;7(14):2000261. doi: 10.1002/advs.202000261. eCollection 2020 Jul.

Machine Learning Glove Using Self-Powered Conductive Superhydrophobic Triboelectric Textile for Gesture Recognition in VR/AR Applications


Feng Wen et al. Adv Sci (Weinh).

Abstract

The rapid progress of Internet of Things (IoT) technology creates a pressing demand for human-machine interfaces (HMIs), which provide the critical link between humans and machines. A glove is an intuitive, low-cost HMI that can conveniently track the motions of human fingers, offering a straightforward medium for human-machine interaction. By combining several triboelectric textile sensors with an appropriate machine learning technique, a minimalist-designed glove has great potential to realize complex gesture recognition for comprehensive control in both real and virtual space. However, humidity or sweat may degrade both the triboelectric output and the textile itself. Hence, in this work, a facile carbon nanotubes/thermoplastic elastomer (CNTs/TPE) coating approach is investigated in detail to render the triboelectric textile superhydrophobic and thereby improve its performance. With excellent energy harvesting and human motion sensing capabilities, the glove made of the superhydrophobic textile realizes a low-cost, self-powered interface for gesture recognition. By leveraging machine learning, various gesture recognition tasks are performed in real time, achieving highly accurate virtual reality/augmented reality (VR/AR) controls, including gun shooting, baseball pitching, and flower arrangement, with minimal effect from sweat during operation.

Keywords: gesture recognition; machine learning; superhydrophobic textiles; triboelectric nanogenerators (TENGs); virtual reality/augmented reality (VR/AR) controls.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1
a) Schematic diagram of the diversified applications enabled by the developed superhydrophobic triboelectric textile. The man model is reproduced with permission from Freepik.com (https://www.freepik.com/). bi) Fabrication process of the superhydrophobic textile (i.e., the positive layer). bii) SEM image of the fibers of the superhydrophobic textile and biii) an enlarged view. c) Configuration of the superhydrophobic textile TENG. d) Image of the superhydrophobic textile TENG held with tweezers, showing its flexibility. e) The triboelectric mechanism of the two structures.
Figure 2
Basic characterization and optimization of TPE content. a) The Voc, b) Isc output, and c) their dependence on TPE content. d) Contact angle, e) sheet resistance, and f) retention rate of Voc after 5000 cycles of mechanical loading, each as a function of TPE content. g) The fraction of Voc retained as RH increases from 35% to 85% for the pristine and superhydrophobic textiles, indicating the humidity resistance of each. Real-time output voltage of the textile h) without and i) with superhydrophobic treatment as RH increases from 35% to 62%. (The untreated textile is nonconductive; a commercial conductive textile is therefore attached to its back to serve as the electrode.)
Figure 3
Biomechanical energy harvesting with the superhydrophobic textile TENG. a) Calibration curve of output voltage against the force applied by standard weights. b) Real-time output voltage under increasing relative humidity. c) Schematic of the device attached to a human elbow to harvest elbow-bending energy. d) Output voltage at bending angles of 60°, 90°, and 120°. e) Schematic of energy harvesting from hand tapping and f) the output voltage under small, medium, and large forces. g) Schematic of energy harvesting from walking and running. h) Output voltage during slow walking, fast walking, and running. i) Power curves of elbow bending, hand tapping, and running using the treated and untreated textiles. j) Charging curves of elbow bending, hand tapping, and running. k) Photographs of powering an electronic watch and a calculator using the electrical energy stored in a 10 µF capacitor by biomechanical energy harvesting.
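The caption above mentions powering small electronics from the energy stored in a 10 µF capacitor. The available energy follows the standard capacitor relation E = ½CV². A quick sanity check (the 3 V charging voltage below is an assumed illustrative value, not a figure reported in the paper):

```python
def stored_energy_joules(capacitance_f: float, voltage_v: float) -> float:
    """Energy stored in a capacitor: E = 0.5 * C * V^2 (SI units)."""
    return 0.5 * capacitance_f * voltage_v ** 2

# 10 uF capacitor as in the caption; 3 V is a hypothetical charge level.
energy = stored_energy_joules(10e-6, 3.0)  # 4.5e-5 J, i.e., 45 uJ
```

At this scale the harvested energy is tens of microjoules, which is consistent with driving low-power devices such as a watch or calculator for short bursts.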
Figure 4
Human exercise monitoring. a) Schematic of the human exercise monitoring system, which calculates steps, velocity, distance, and burned calories. The man model is reproduced with permission from Freepik.com (https://www.freepik.com/). b) Calibration curves of output voltage against sweat volume for slow walking, fast walking, and running with the treated and untreated textiles, and the output voltage during 1 h of slow walking, fast walking, and running with the treated (red line) and untreated (blue line) textiles. c–ei) Output voltage, c–eii) time interval, c–eiii) instantaneous velocity, and c–eiv) display interface of the calculated steps, distance, and burned calories for c) slow walking, d) fast walking, and e) running.
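The caption describes deriving steps, instantaneous velocity, distance, and burned calories from the time interval between successive voltage peaks. A minimal sketch of that pipeline follows; the stride length and calorie-per-step constants are assumed placeholders for illustration, not values from the paper:

```python
def exercise_metrics(peak_times_s, stride_m=0.7, kcal_per_step=0.04):
    """Derive step count, instantaneous velocities, distance, and calories
    from the timestamps (s) of triboelectric voltage peaks, assuming one
    peak per step. stride_m and kcal_per_step are illustrative constants."""
    steps = len(peak_times_s)
    # Instantaneous velocity between consecutive steps: stride / interval.
    velocities = [stride_m / (t2 - t1)
                  for t1, t2 in zip(peak_times_s, peak_times_s[1:])]
    distance_m = steps * stride_m
    calories_kcal = steps * kcal_per_step
    return steps, velocities, distance_m, calories_kcal

# Hypothetical peak timestamps: four steps, one every 0.5 s.
steps, velocities, distance_m, calories_kcal = exercise_metrics([0.0, 0.5, 1.0, 1.5])
```

With one peak every 0.5 s and a 0.7 m stride, the sketch yields an instantaneous velocity of 1.4 m/s, matching the kind of per-step readout shown in panels c–eiii.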
Figure 5
Demonstration of a shooting game based on the amplitude of the output signals. a) Schematic of the control system. b) Signal patterns of grabbing, loading the gun, and shooting. c) Corresponding screenshots of grabbing, loading the gun, and shooting in the VR space of Unity.
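Because this demonstration distinguishes gestures purely by signal amplitude, the control logic reduces to thresholding the peak voltage. A hedged sketch of such an amplitude classifier is below; the threshold values are invented for illustration and are not reported in the paper:

```python
def classify_action(peak_voltage: float,
                    grab_thresh: float = 1.0,
                    load_thresh: float = 3.0,
                    shoot_thresh: float = 5.0) -> str:
    """Map a triboelectric peak amplitude to a game action by thresholding.
    All threshold values are hypothetical placeholders."""
    if peak_voltage >= shoot_thresh:
        return "shoot"
    if peak_voltage >= load_thresh:
        return "load"
    if peak_voltage >= grab_thresh:
        return "grab"
    return "idle"
```

A game engine such as Unity would poll the sensor stream, extract each peak, and call a mapping like this to trigger the corresponding in-game event.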
Figure 6
Demonstration of a baseball game scenario with machine learning. a) Flow chart for gesture recognition and control. b) Structure of the CNN model. c) Signal patterns of the three gestures. d) Confusion matrix for three common ball-pitching gestures. e) Photographs of the three gestures (left) and corresponding screenshots of using them to achieve VR control in Unity (right).
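The caption references a CNN that classifies windowed sensor signals into three gestures. A minimal NumPy sketch of the forward pass of such a 1D CNN is shown below; the architecture, layer sizes, and random weights are assumptions for illustration, not the paper's actual model:

```python
import numpy as np

def conv1d(x, kernels):
    """Valid 1D convolution: x is (in_ch, length), kernels is (out_ch, in_ch, k)."""
    out_ch, in_ch, k = kernels.shape
    length = x.shape[1] - k + 1
    out = np.zeros((out_ch, length))
    for o in range(out_ch):
        for i in range(length):
            out[o, i] = np.sum(kernels[o] * x[:, i:i + k])
    return out

def forward(x, kernels, w, b):
    """Conv -> ReLU -> global average pool -> dense -> softmax."""
    h = np.maximum(conv1d(x, kernels), 0.0)   # ReLU activation
    pooled = h.mean(axis=1)                   # global average pooling
    logits = w @ pooled + b                   # dense classification layer
    e = np.exp(logits - logits.max())
    return e / e.sum()                        # softmax over the 3 gestures

# Hypothetical sizes: 1 sensor channel, 100-sample window, 4 conv filters,
# 3 gesture classes; weights are random stand-ins for trained parameters.
rng = np.random.default_rng(0)
x = rng.standard_normal((1, 100))
kernels = rng.standard_normal((4, 1, 5)) * 0.1
w, b = rng.standard_normal((3, 4)) * 0.1, np.zeros(3)
probs = forward(x, kernels, w, b)
```

The output is a probability vector over the three gestures; in the demonstration, the argmax would select which pitching action to trigger in Unity.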
Figure 7
Illustration of the superhydrophobic glove achieving better accuracy when the wearer sweats. a) Photographs of four gestures with similar signal patterns. Signal patterns of the untreated and treated gloves b) without sweat and e) with sweat. Without sweat, the confusion matrices for the c) untreated and d) treated gloves. With sweat, the confusion matrices for the f) untreated and g) treated gloves, showing that the glove with superhydrophobic textile maintains high accuracy even under sweating conditions.
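Panels c–g above compare classification performance via confusion matrices. A small sketch of how such a matrix and its overall accuracy are computed from true versus predicted gesture labels follows; the label data here are invented examples, not results from the paper:

```python
def confusion_matrix(true_labels, pred_labels, n_classes):
    """Build a confusion matrix: rows = true gesture, columns = predicted."""
    m = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(true_labels, pred_labels):
        m[t][p] += 1
    return m

def accuracy(matrix):
    """Overall accuracy: trace of the matrix over the total count."""
    correct = sum(matrix[i][i] for i in range(len(matrix)))
    total = sum(sum(row) for row in matrix)
    return correct / total

# Hypothetical predictions for 4 similar gestures (one misclassification):
cm = confusion_matrix([0, 0, 1, 2, 3, 3], [0, 1, 1, 2, 3, 3], 4)
acc = accuracy(cm)
```

Off-diagonal entries flag which gesture pairs are confused; sweat-induced signal degradation on the untreated glove would show up as mass moving off the diagonal.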
Figure 8
AR demonstration of flower arrangement based on complex gesture recognition using machine learning. a) Photographs of the eleven gestures. b) Signal patterns of the eleven gestures. c) Confusion matrix of the gesture recognition. d) Corresponding screenshots of the eleven gestures in the AR space of Unity.
