StairNet: visual recognition of stairs for human-robot locomotion
- PMID: 38360664
- PMCID: PMC10870468
- DOI: 10.1186/s12938-024-01216-0
Abstract
Human-robot walking with prosthetic legs and exoskeletons, especially over complex terrains such as stairs, remains a significant challenge. Egocentric vision has the unique potential to detect the walking environment prior to physical interaction, which can improve transitions to and from stairs. This motivated us to develop the StairNet initiative to support the development of new deep learning models for visual perception of real-world stair environments. In this study, we present a comprehensive overview of the StairNet initiative and key research to date. First, we summarize the development of our large-scale data set with over 515,000 manually labeled images. We then provide a summary and detailed comparison of the performances achieved with different algorithms (i.e., 2D and 3D CNN, hybrid CNN-LSTM, and ViT networks), training methods (i.e., supervised learning with and without temporal data, and semi-supervised learning with unlabeled images), and deployment methods (i.e., mobile and embedded computing) using the StairNet data set. Finally, we discuss the challenges and future directions. To date, our StairNet models have consistently achieved high classification accuracy (i.e., up to 98.8%) across different designs, offering trade-offs between model accuracy and size. When deployed on mobile devices with GPU and NPU accelerators, our deep learning models achieved inference times as fast as 2.8 ms. In comparison, when deployed on our custom-designed CPU-powered smart glasses, our models yielded slower inference times of 1.5 s, presenting a trade-off between human-centered design and performance. Overall, the results of numerous experiments presented herein provide consistent evidence that StairNet can be an effective platform to develop and study new deep learning models for visual perception of human-robot walking environments, with an emphasis on stair recognition. This research aims to support the development of next-generation vision-based control systems for robotic prosthetic legs, exoskeletons, and other mobility assistive technologies.
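The abstract describes deep learning classifiers that assign each egocentric camera frame to a stair-related environment class. As a minimal illustration of that kind of pipeline (not the authors' implementation), the sketch below runs a toy conv → ReLU → global-average-pool → softmax classifier in NumPy; the class labels, layer sizes, and random weights are all assumptions for demonstration only.

```python
import numpy as np

# Hypothetical label set; the actual StairNet classes may differ.
CLASSES = ["level-ground", "incline-stairs", "decline-stairs", "other"]

def conv2d(image, kernel):
    """Valid-mode 2D convolution (single channel, single kernel)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def softmax(x):
    """Numerically stable softmax over class logits."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def classify(image, kernels, weights, bias):
    """Conv -> ReLU -> global average pool -> linear -> softmax."""
    feats = np.array([np.maximum(conv2d(image, k), 0).mean() for k in kernels])
    return softmax(weights @ feats + bias)

rng = np.random.default_rng(0)
image = rng.random((32, 32))              # stand-in for an egocentric camera frame
kernels = rng.standard_normal((8, 3, 3))  # 8 toy convolutional filters
weights = rng.standard_normal((len(CLASSES), 8))
bias = np.zeros(len(CLASSES))

probs = classify(image, kernels, weights, bias)
print(CLASSES[int(np.argmax(probs))], probs.round(3))
```

A real system would replace the random weights with a trained network (e.g., the 2D/3D CNN or ViT backbones named in the abstract) and run per-frame inference on the device's GPU or NPU.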
Keywords: Computer vision; Deep learning; Exoskeletons; Prosthetics; Wearable robotics.
© 2024. The Author(s).
Conflict of interest statement
The authors declare that they have no competing interests.