Appl Bionics Biomech. 2016;2016:5058171. doi: 10.1155/2016/5058171. Epub 2016 Jun 15.

Optical Enhancement of Exoskeleton-Based Estimation of Glenohumeral Angles


Camilo Cortés et al. Appl Bionics Biomech. 2016.

Abstract

In Robot-Assisted Rehabilitation (RAR), the accurate estimation of the patient's limb joint angles is critical for assessing therapy efficacy. In RAR, the use of classic motion capture systems (MOCAPs) (e.g., optical and electromagnetic) to estimate the Glenohumeral (GH) joint angles is hindered by the exoskeleton body, which causes occlusions and magnetic disturbances. Moreover, the exoskeleton posture does not accurately reflect the limb posture, as their kinematic models differ. To address these limitations in posture estimation, we propose installing the cameras of an optical marker-based MOCAP on the rehabilitation exoskeleton. The GH joint angles are then estimated by combining the estimated marker poses with the exoskeleton Forward Kinematics. Such a hybrid system avoids the problems of marker occlusion, reduced camera detection volume, and imprecise joint angle estimation caused by the kinematic mismatch between the patient and exoskeleton models. This paper presents the formulation, simulation, and accuracy quantification of the proposed method with simulated human movements. In addition, a sensitivity analysis of the method's accuracy to marker position estimation errors, arising from system calibration errors and marker drift, has been carried out. The results show that, even with significant errors in the marker position estimates, the method's accuracy is adequate for RAR.
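
The core estimation step sketched in the abstract can be illustrated with a short, self-contained example. This is not the authors' implementation: the marker offsets, the use of the Kabsch algorithm for the best-fit rotation, and the Z-Y-X Euler convention for the GH angles are all illustrative assumptions. In the actual hybrid system, the measured marker positions would be mapped into the shoulder coordinate system through the exoskeleton Forward Kinematics chain; here they are simply simulated.

```python
# Minimal sketch (NOT the authors' implementation) of the paper's core idea:
# estimate the Glenohumeral (GH) rotation from the positions of markers
# rigidly attached to the upper arm, expressed in the shoulder frame.
# Marker offsets, noise level, and the Euler convention are assumptions.
import numpy as np

def kabsch_rotation(p_local: np.ndarray, q_meas: np.ndarray) -> np.ndarray:
    """Best-fit rotation R minimizing sum ||R p_i - q_i||^2, where rows of
    p_local are marker offsets in the humerus frame and rows of q_meas are
    the same markers measured in the shoulder frame (both taken relative
    to the GH joint center)."""
    h = p_local.T @ q_meas                  # 3x3 cross-covariance matrix
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))  # guard against reflections
    return vt.T @ np.diag([1.0, 1.0, d]) @ u.T

def gh_angles_from_rotation(r: np.ndarray):
    """Decompose R with an illustrative Z-Y-X Euler sequence standing in for
    shoulder flexion-extension (SFE), horizontal abduction-adduction (SAbAd),
    and internal rotation (SIR). The paper's actual convention may differ."""
    sfe = np.arctan2(r[1, 0], r[0, 0])
    sabad = np.arcsin(-r[2, 0])
    sir = np.arctan2(r[2, 1], r[2, 2])
    return sfe, sabad, sir

# --- toy usage -------------------------------------------------------------
# Hypothetical offsets of markers m0, m1 in the humerus frame (meters).
p_local = np.array([[0.05, 0.00, -0.10],
                    [0.00, 0.04, -0.20]])
# Ground-truth GH rotation, used here only to synthesize "measurements".
theta = np.deg2rad(30.0)
r_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
# In the hybrid system q_meas would come from the exoskeleton-mounted cameras
# mapped through the exoskeleton FK chain; here we simulate noisy measurements.
q_meas = (r_true @ p_local.T).T + np.random.normal(0.0, 1e-4, p_local.shape)

r_est = kabsch_rotation(p_local, q_meas)
sfe, sabad, sir = np.rad2deg(gh_angles_from_rotation(r_est))
print(f"SFE={sfe:.2f} deg, SAbAd={sabad:.2f} deg, SIR={sir:.2f} deg")
```

With only two markers (m0, m1) and a fixed GH center, the rotation is fully determined as long as the two marker offset vectors are not parallel; the sign correction in the SVD step prevents the solver from returning a reflection.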


Figures

Figure 1. Robotic and VR-based rehabilitation.
Figure 2. Components of the GH joint angles estimation system: (a) human kinematic model, (b) exoskeleton kinematic model, (c) marker-based optical motion capture system, and (d) hybrid GH joint angles estimation system.
Figure 3. (a) Schematic diagram of the hybrid GH joint angles estimation system and (b) high-level operation of the system.
Figure 4. Schematic diagram of the iterative estimation of the pose of the markers.
Figure 5. Estimation of disk coordinates in the color image: (a) simulated RGB image, (b) result of the color segmentation (zoomed image), and (c) result of the blob extraction (zoomed image). (A hedged code sketch of this detection step appears after the figure list.)
Figure 6. Schematic diagram of the iterative estimation of the pose of the cameras.
Figure 7. Schematic diagram of the iterative estimation of the pose of the markers w.r.t. the exoskeleton CS.
Figure 8. Schematic diagram of the iterative estimation of the upper arm pose.
Figure 9. Coordinate systems for the upper arm pose estimation.
Figure 10. GH joint movements: (a) shoulder flexion-extension (SFE), (b) shoulder horizontal abduction-adduction (SAbAd), and (c) shoulder internal rotation (SIR).
Figure 11. Sensitivity analysis steps.
Figure 12. Sensitivity analysis: coordinate systems of reference for the translations of (a) marker m0 and (b) marker m1.
Figure 13. Box plots of estimation errors in marker position and upper arm position and orientation for all movement datasets.
Figure 14. Results of the sensitivity analysis with the SAbAd movement dataset (qj: m0-x movement/m0-y movement/m0-z movement/m1-x movement/m1-y movement/m1-z movement).
Figure 15. Results of the sensitivity analysis with the SFE movement dataset (qj: m0-x movement/m0-y movement/m0-z movement/m1-x movement/m1-y movement/m1-z movement).
Figure 16. Results of the sensitivity analysis with the SIR movement dataset (qj: m0-x movement/m0-y movement/m0-z movement/m1-x movement/m1-y movement/m1-z movement).
Figure 17. Results of the sensitivity analysis with the COMB movement dataset (qj: m0-x movement/m0-y movement/m0-z movement/m1-x movement/m1-y movement/m1-z movement).
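
Figure 5 depicts the marker-detection step: color segmentation of the marker disk in the camera image followed by blob extraction. The following is a minimal sketch of such a step; the paper does not specify the implementation, so the use of OpenCV, the HSV thresholds, and the function names are assumptions.

```python
# Hedged sketch of the disk-detection step illustrated in Figure 5:
# segment a colored marker disk in an RGB image, then extract the
# centroid of the resulting blob. Thresholds are illustrative only.
import cv2
import numpy as np

def detect_disk_centroid(image_bgr: np.ndarray,
                         hsv_lo=(40, 80, 80), hsv_hi=(80, 255, 255)):
    """Return the (u, v) pixel centroid of the largest blob whose HSV color
    falls inside [hsv_lo, hsv_hi], or None if no blob is found."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    # Color segmentation: binary mask of pixels inside the HSV range.
    mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))
    # Blob extraction: connected components on the binary mask.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    if n < 2:                                   # label 0 is the background
        return None
    largest = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])
    return tuple(centroids[largest])            # sub-pixel centroid (u, v)

# Toy usage on a synthetic image with a green disk at pixel (320, 240).
img = np.zeros((480, 640, 3), np.uint8)
cv2.circle(img, (320, 240), 20, (0, 255, 0), -1)  # filled BGR-green disk
print(detect_disk_centroid(img))                  # approx. (320.0, 240.0)
```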


References

    1. Patton J., Dawe G., Scharver C., Mussa-Ivaldi F., Kenyon R. Robotics and virtual reality: a perfect marriage for motor control research and rehabilitation. Assistive Technology. 2006;18(2):181–195. doi: 10.1080/10400435.2006.10131917.
    2. Guidali M., Duschau-Wicke A., Broggi S., Klamroth-Marganska V., Nef T., Riener R. A robotic system to train activities of daily living in a virtual environment. Medical and Biological Engineering and Computing. 2011;49(10):1213–1223. doi: 10.1007/s11517-011-0809-0.
    3. Frisoli A., Procopio C., Chisari C., et al. Positive effects of robotic exoskeleton training of upper limb reaching movements after stroke. Journal of NeuroEngineering and Rehabilitation. 2012;9(1, article no. 36). doi: 10.1186/1743-0003-9-36.
    4. Gilliaux M., Lejeune T., Detrembleur C., Sapin J., Dehez B., Stoquart G. A robotic device as a sensitive quantitative tool to assess upper limb impairments in stroke patients: a preliminary prospective cohort study. Journal of Rehabilitation Medicine. 2012;44(3):210–217. doi: 10.2340/16501977-0926.
    5. Zhou H., Hu H. Human motion tracking for rehabilitation: a survey. Biomedical Signal Processing and Control. 2008;3(1):1–18. doi: 10.1016/j.bspc.2007.09.001.
