2022 Dec 10;22(24):9690.
doi: 10.3390/s22249690.

Real-Time Human Activity Recognition with IMU and Encoder Sensors in Wearable Exoskeleton Robot via Deep Learning Networks


Ismael Espinoza Jaramillo et al. Sensors (Basel).

Abstract

Wearable exoskeleton robots have become a promising technology for supporting human motion in multiple tasks. Real-time activity recognition provides useful information for enhancing the robot's control assistance during daily tasks. This work implements a real-time activity recognition system based on the activity signals of an inertial measurement unit (IMU) and a pair of rotary encoders integrated into the exoskeleton robot. Five deep learning models were trained and evaluated for activity recognition. A subset of the optimized deep learning models was then transferred to an edge device for real-time evaluation in a continuous-action environment covering eight common human tasks: stand, bend, crouch, walk, sit-down, sit-up, and ascend and descend stairs. These eight activities of the robot wearer were recognized with an average accuracy of 97.35% in real-time tests, with an inference time under 10 ms and an overall latency of 0.506 s per recognition on the selected edge device.
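The pipeline described above classifies continuous IMU and encoder streams in real time, which implies segmenting the incoming signal into fixed-length windows before feeding them to a model. The sketch below illustrates this windowing step only; the window length, stride, and channel layout (6 IMU channels plus 2 encoder channels at 100 Hz) are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

def sliding_windows(stream, win_len=100, stride=50):
    """Segment a (T, C) multichannel signal into overlapping windows.

    stream : array of shape (T, C), e.g. 6 IMU channels + 2 encoder channels.
    Returns an array of shape (N, win_len, C), where each window would be
    passed to the activity classifier.
    """
    T = stream.shape[0]
    starts = range(0, T - win_len + 1, stride)
    return np.stack([stream[s:s + win_len] for s in starts])

# Hypothetical 5 s of data at 100 Hz with 8 channels (6 IMU + 2 encoders).
stream = np.zeros((500, 8))
windows = sliding_windows(stream)
print(windows.shape)  # (9, 100, 8)
```

With a 50% overlap between consecutive windows, each new half-window of samples yields a fresh prediction, which is one common way such systems keep per-recognition latency low.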

Keywords: deep learning networks; encoders; inertial measurement unit; real-time human activity recognition; wearable exoskeleton robot.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1. The wearable robot and implemented HAR system: (a) Wearable exoskeleton robot, (b) Data collection and preprocessing, (c) Deep learning models for HAR, (d) Hardware platforms, and (e) Recognized activities.

Figure 2. A scenario for continuous human activities of the robot wearer.

Figure 3. Deep learning model structures: (a) CNN, (b) RNN, (c) LSTM, (d) BiLSTM, (e) GRU.

Figure 4. A sample confusion matrix with the Bi-LSTM-2L model and epoch dataset.

Figure 5. Continuous real-time HAR results from two subjects: (a) S1 and (b) S2.

Figure 6. Recognition examples and screenshots with HAR system results during continuous real-time tests.
