Deep generative models with data augmentation to learn robust representations of movement intention for powered leg prostheses
- PMID: 36159881
- PMCID: PMC9499185
- DOI: 10.1109/tmrb.2019.2952148
Abstract
Intent recognition is a data-driven alternative to expert-crafted rules for triggering transitions between pre-programmed activity modes of a powered leg prosthesis. Movement-related signals from prosthesis sensors detected prior to movement completion are used to predict the upcoming activity. Usually, training data comprising labeled examples of each activity are necessary; however, collecting a sufficiently large and rich training dataset from an amputee population is tedious. In addition, covariate shift can have detrimental effects on a controller's prediction accuracy if the classifier's learned representation of movement intention is not robust enough. Our objective was to develop and evaluate techniques for learning robust representations of movement intention using data augmentation and deep neural networks. In an offline analysis of data collected from four amputee subjects across three days each, we demonstrate that our approach produced realistic synthetic sensor data that helped reduce error rates when training and testing on different days and different users. Our novel approach introduces an effective and generalizable strategy for augmenting wearable robotics sensor data, challenging the pre-existing notion that rehabilitation robotics can derive only limited benefit from state-of-the-art deep learning techniques, which typically require vast amounts of data.
Keywords: data augmentation; deep learning; intent recognition; prosthesis control; signal processing.
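To make the augmentation strategy concrete, the following is a minimal, hypothetical sketch (not the authors' code): it assumes a conditional variational autoencoder trained on fixed-length prosthesis sensor windows, which is then sampled per activity class to append synthetic examples to the real training set before fitting an intent classifier. The sensor channel count (N_CH), window length (WIN), number of activity modes (N_CLASSES), latent size, and the toy random data are illustrative assumptions, not details from the paper.

# Hypothetical sketch: conditional VAE used for per-class sensor-window augmentation.
import torch
import torch.nn as nn
import torch.nn.functional as F

N_CH, WIN, N_CLASSES, LATENT = 8, 100, 5, 16  # assumed channels, window length, modes

class CondVAE(nn.Module):
    def __init__(self):
        super().__init__()
        d_in = N_CH * WIN
        self.enc = nn.Sequential(nn.Linear(d_in + N_CLASSES, 256), nn.ReLU())
        self.mu, self.logvar = nn.Linear(256, LATENT), nn.Linear(256, LATENT)
        self.dec = nn.Sequential(nn.Linear(LATENT + N_CLASSES, 256), nn.ReLU(),
                                 nn.Linear(256, d_in))

    def forward(self, x, y_onehot):
        h = self.enc(torch.cat([x.flatten(1), y_onehot], dim=1))
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterization trick
        recon = self.dec(torch.cat([z, y_onehot], dim=1)).view_as(x)
        return recon, mu, logvar

def vae_loss(recon, x, mu, logvar):
    rec = F.mse_loss(recon, x, reduction="mean")
    kld = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + 1e-3 * kld

# Toy stand-in for real prosthesis sensor windows and activity-mode labels.
x_real = torch.randn(512, N_CH, WIN)
y_real = torch.randint(0, N_CLASSES, (512,))
y_1hot = F.one_hot(y_real, N_CLASSES).float()

model = CondVAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):                          # brief training loop, for illustration only
    recon, mu, logvar = model(x_real, y_1hot)
    loss = vae_loss(recon, x_real, mu, logvar)
    opt.zero_grad(); loss.backward(); opt.step()

# Sample synthetic windows for each class and append them to the real data
# before training the downstream intent classifier.
with torch.no_grad():
    y_syn = torch.arange(N_CLASSES).repeat_interleave(100)
    z = torch.randn(len(y_syn), LATENT)
    x_syn = model.dec(torch.cat([z, F.one_hot(y_syn, N_CLASSES).float()], dim=1))
    x_syn = x_syn.view(-1, N_CH, WIN)

x_aug = torch.cat([x_real, x_syn])            # augmented training set
y_aug = torch.cat([y_real, y_syn])

In practice the augmented set (x_aug, y_aug) would replace the original small training set when fitting the intent classifier; the paper's actual generative architecture, sensor configuration, and training details may differ from this sketch.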