Environment Classification for Robotic Leg Prostheses and Exoskeletons Using Deep Convolutional Neural Networks
- PMID: 35185507
- PMCID: PMC8855111
- DOI: 10.3389/fnbot.2021.730965
Abstract
Robotic leg prostheses and exoskeletons can provide powered locomotor assistance to older adults and/or persons with physical disabilities. However, the current locomotion mode recognition systems being developed for automated high-level control and decision-making rely on mechanical, inertial, and/or neuromuscular sensors, which inherently have limited prediction horizons (i.e., analogous to walking blindfolded). Inspired by the human vision-locomotor control system, we developed an environment classification system powered by computer vision and deep learning to predict the oncoming walking environments prior to physical interaction, therein allowing for more accurate and robust high-level control decisions. In this study, we first reviewed the development of our "ExoNet" database, the largest and most diverse open-source dataset of wearable camera images of indoor and outdoor real-world walking environments, which were annotated using a hierarchical labeling architecture. We then trained and tested over a dozen state-of-the-art deep convolutional neural networks (CNNs) on the ExoNet database for image classification and automatic feature engineering, including EfficientNetB0, InceptionV3, MobileNet, MobileNetV2, VGG16, VGG19, Xception, ResNet50, ResNet101, ResNet152, DenseNet121, DenseNet169, and DenseNet201. Finally, we quantitatively compared the benchmarked CNN architectures and their environment classification predictions using an operational metric called "NetScore," which balances the image classification accuracy with the computational and memory storage requirements (i.e., important for onboard real-time inference with mobile computing devices). Our comparative analyses showed that the EfficientNetB0 network achieves the highest test accuracy; VGG16 the fastest inference time; and MobileNetV2 the best NetScore, which can inform the optimal architecture design or selection depending on the desired performance. Overall, this study provides a large-scale benchmark and reference for next-generation environment classification systems for robotic leg prostheses and exoskeletons.
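As a concrete starting point, the sketch below illustrates the general transfer-learning recipe described in the abstract: an ImageNet-pretrained CNN from the benchmarked family (EfficientNetB0 here) is given a new classification head and trained on folder-organized walking-environment images, and a NetScore-style figure of merit is computed from accuracy, parameter count, and multiply-accumulate operations. This is a minimal illustration, not the authors' released pipeline; the directory path, image size, class count, training hyperparameters, and NetScore coefficients are assumptions for demonstration only.

# Minimal sketch (not the authors' released code): fine-tuning a pretrained
# EfficientNetB0 on ExoNet-style walking-environment images with TensorFlow/Keras.
# Directory layout, image size, and class count are illustrative assumptions.
import math
import tensorflow as tf

IMG_SIZE = (224, 224)      # assumed input resolution
NUM_CLASSES = 12           # assumed number of environment classes

# Load images from a folder-per-class dataset (hypothetical path).
train_ds = tf.keras.utils.image_dataset_from_directory(
    "exonet/train", image_size=IMG_SIZE, batch_size=32)

# ImageNet-pretrained backbone with a new classification head.
backbone = tf.keras.applications.EfficientNetB0(
    include_top=False, weights="imagenet", pooling="avg",
    input_shape=IMG_SIZE + (3,))
backbone.trainable = False   # feature extraction first; unfreeze later to fine-tune

model = tf.keras.Sequential([
    backbone,
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=10)

def netscore(accuracy_pct, params_millions, macs_millions,
             alpha=2.0, beta=0.5, gamma=0.5):
    """NetScore-style figure of merit: rewards accuracy while penalizing
    parameter count and multiply-accumulate operations. Coefficients and
    units follow the commonly cited defaults and may differ from the exact
    weighting used in the study."""
    return 20.0 * math.log10(
        accuracy_pct ** alpha / (params_millions ** beta * macs_millions ** gamma))

The same head-and-backbone pattern can be repeated with any of the other benchmarked architectures (e.g., MobileNetV2) by swapping the backbone constructor, which is how a practitioner would trade off accuracy, inference time, and NetScore for a given onboard computing budget.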
Keywords: artificial intelligence; biomechatronics; computer vision; deep learning; exoskeletons; prosthetics; rehabilitation robotics; wearables.
Copyright © 2022 Laschowski, McNally, Wong and McPhee.
Conflict of interest statement
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Similar articles
- StairNet: visual recognition of stairs for human-robot locomotion. Biomed Eng Online. 2024 Feb 15;23(1):20. doi: 10.1186/s12938-024-01216-0. PMID: 38360664. Free PMC article. Review.
- Computer Vision and Deep Learning for Environment-Adaptive Control of Robotic Lower-Limb Exoskeletons. Annu Int Conf IEEE Eng Med Biol Soc. 2021 Nov;2021:4631-4635. doi: 10.1109/EMBC46164.2021.9630064. PMID: 34892246.
- Stair Recognition for Robotic Exoskeleton Control using Computer Vision and Deep Learning. IEEE Int Conf Rehabil Robot. 2022 Jul;2022:1-6. doi: 10.1109/ICORR55369.2022.9896501. PMID: 36176138.
- Preliminary Design of an Environment Recognition System for Controlling Robotic Lower-Limb Prostheses and Exoskeletons. IEEE Int Conf Rehabil Robot. 2019 Jun;2019:868-873. doi: 10.1109/ICORR.2019.8779540. PMID: 31374739.
- Human activity recognition using tools of convolutional neural networks: A state of the art review, data sets, challenges, and future prospects. Comput Biol Med. 2022 Oct;149:106060. doi: 10.1016/j.compbiomed.2022.106060. Epub 2022 Sep 1. PMID: 36084382. Review.
Cited by
- Automatic Stub Avoidance for a Powered Prosthetic Leg Over Stairs and Obstacles. IEEE Trans Biomed Eng. 2024 May;71(5):1499-1510. doi: 10.1109/TBME.2023.3340628. Epub 2024 Apr 22. PMID: 38060364. Free PMC article.
- Editorial: Next Generation User-Adaptive Wearable Robots. Front Robot AI. 2022 Jun 22;9:920655. doi: 10.3389/frobt.2022.920655. eCollection 2022. PMID: 35899075. Free PMC article. No abstract available.
- Control strategies used in lower limb exoskeletons for gait rehabilitation after brain injury: a systematic review and analysis of clinical effectiveness. J Neuroeng Rehabil. 2023 Feb 19;20(1):23. doi: 10.1186/s12984-023-01144-5. PMID: 36805777. Free PMC article.
- Continuous A-Mode Ultrasound-Based Prediction of Transfemoral Amputee Prosthesis Kinematics Across Different Ambulation Tasks. IEEE Trans Biomed Eng. 2024 Jan;71(1):56-67. doi: 10.1109/TBME.2023.3292032. Epub 2023 Dec 22. PMID: 37428665. Free PMC article.
- StairNet: visual recognition of stairs for human-robot locomotion. Biomed Eng Online. 2024 Feb 15;23(1):20. doi: 10.1186/s12938-024-01216-0. PMID: 38360664. Free PMC article. Review.