PLoS One. 2017 Apr 14;12(4):e0175813.
doi: 10.1371/journal.pone.0175813. eCollection 2017.

Validation of enhanced Kinect sensor based motion capturing for gait assessment

Björn Müller et al. PLoS One. 2017.

Abstract

Optical motion capturing systems are expensive and require substantial dedicated space to be set up. On the other hand, they provide unsurpassed accuracy and reliability. In many situations, however, flexibility is required and the motion capturing system can only be set up temporarily. The Microsoft Kinect v2 sensor is comparatively cheap, and promising results have been published with respect to gait analysis. We here present a motion capturing system that is easy to set up, flexible with respect to sensor locations, and delivers accuracy in gait parameters comparable to a gold-standard motion capturing system (VICON). Further, we demonstrate that sensor setups which track the person from one side only are less accurate and should be replaced by two-sided setups. With respect to commonly analyzed gait parameters, especially step width, our system shows higher agreement with the VICON system than previous reports.


Conflict of interest statement

Competing Interests: The authors have declared that no competing interests exist.

Figures

Fig 1
Fig 1. Architecture.
Connections between the Kinect sensors, the Zotac mini computers running the client software, and the server computer in a setup with six Kinect sensors.
Fig 2
Fig 2. Screenshots of the server program.
(A-B) Three-dimensional point clouds of six Kinect sensors after spatial calibration. Each point is represented as a tiny cube with the color of the corresponding pixel in the color image. The big colored cubes indicate the Kinect sensors. The red, green and blue lines attached to the colored boxes (sensors) indicate the axes of the local coordinate systems. Notice that the edge of the red coating on the floor is very straight, which illustrates the precise spatial calibration. (A) The marker in the very back is the marker with id 1; it represents the origin of the global coordinate system, which is indicated by the grid. (B) A subject standing in the tracking volume to visualize its dimensions. (C) Screenshot of the live depth image view. The tracked body is highlighted. By walking through the tracking volume one can easily identify blind spots.
Fig 3
Fig 3. Markers and bipartite graph used for the spatial calibration.
(A) The shape of the markers is easily detectable in the RGB image; the white squares in the center encode the marker id [20]. The red circles indicate the salient points which have been used for defining the position and orientation of the marker. (B) Graph illustrating the “sees / is seen” relation (edges) between sensors (blue vertices) and markers (red vertices). For example, S3 cannot see M1 directly but indirectly via M2 and S2. A second possibility is via M2 and S5.
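The "sees / is seen" graph suggests how the calibration can be chained: a sensor that does not see the origin marker directly (like S3) can still be expressed in the global frame by composing transforms along a path through shared markers. A minimal sketch of this idea, with hypothetical names; rigid transforms are simplified here to one-dimensional offsets, whereas the actual system would compose full 4×4 homogeneous transforms:

```python
from collections import deque

def chain_to_origin(edges, origin="M1"):
    # Each edge (a, b, offset) encodes the relative pose between two nodes
    # (sensor or marker): pose[b] - pose[a] = offset. A breadth-first search
    # from the origin marker composes these offsets along a path, so every
    # reachable node ends up expressed in the global (origin) frame.
    graph = {}
    for a, b, offset in edges:
        graph.setdefault(a, []).append((b, offset))
        graph.setdefault(b, []).append((a, -offset))
    pose = {origin: 0.0}
    queue = deque([origin])
    while queue:
        node = queue.popleft()
        for neighbor, offset in graph[node]:
            if neighbor not in pose:
                pose[neighbor] = pose[node] + offset
                queue.append(neighbor)
    return pose

# Example mirroring the caption: S3 does not see M1 directly,
# but reaches it indirectly via M2 and S2.
edges = [("S2", "M1", 1.0), ("S2", "M2", 2.0), ("S3", "M2", 0.5)]
poses = chain_to_origin(edges)
```

BFS yields the shortest chaining path, which keeps the number of composed (and thus error-accumulating) transforms small.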
Fig 4
Fig 4. Arrangement of the Kinect sensors illustrating the overlapping tracking volumes.
Sensors were arranged using this pattern but then spatially calibrated to achieve high precision. The theoretical length of the tracking volume in walking direction using this setup is about 9 meters. The Kinect sensors were arranged in a way ensuring that the VICON tracking volume was completely covered.
Fig 5
Fig 5. Reconstruction of the body surface and averaged skeleton.
(A, B) Three-dimensional reconstruction of the body surface and the corresponding skeleton reconstruction using two sensors. The surface was estimated from the 3D point clouds using the marching cubes algorithm in MeshLab [32]. Surface areas tracked by only one of the two sensors are highlighted in red (right) and blue (left). The corresponding Kinect skeleton joint position estimates of the two sensors are shown as red and blue dots. The spatially averaged skeleton is indicated as a black stick figure. (B) Magnification of the left lower leg. Notice that the joint position estimates of the left sensor (blue) are closer to the surface that is tracked only by the left sensor (blue), and correspondingly for the joint position estimates of the right sensor. (C) Averaged skeleton and joint position trajectories during walking, obtained using six sensors.
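The spatially averaged skeleton amounts to a per-joint mean of the sensors' joint estimates in the shared global frame. A minimal sketch with a hypothetical `average_skeleton` helper; joints a sensor did not track are marked NaN and are ignored by the mean:

```python
import numpy as np

def average_skeleton(per_sensor_joints):
    # per_sensor_joints: list of (joints x 3) arrays, one per sensor,
    # all expressed in the global frame after spatial calibration.
    stacked = np.stack(per_sensor_joints)   # (sensors, joints, 3)
    return np.nanmean(stacked, axis=0)      # (joints, 3) averaged skeleton

# Two sensors, two joints; the right sensor did not track joint 1.
left = np.array([[0.0, 1.0, 0.0], [0.1, 0.5, 0.0]])
right = np.array([[0.2, 1.0, 0.0], [np.nan, np.nan, np.nan]])
avg = average_skeleton([left, right])
```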
Fig 6
Fig 6. Illustration of the analyzed spatial gait parameters.
Subjects were asked to always start walking with their left foot. The first left step length and the first right stride length (gray) were excluded from the analysis, since these are generally shorter than the steps during actual walking (acceleration phase). Subjects did not stop at the end of the tracking volume (finish), so there was no slowing down. Depending on the subjects' individual step lengths there may be an unequal number of left and right steps. Parameters are assigned to the left or right foot depending on which foot was placed on the floor last; e.g., the first step length and step width are assigned to the left foot.
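The spatial parameters illustrated here can be computed from a sequence of foot placements. A minimal sketch with a hypothetical `spatial_parameters` helper, assuming x is the walking direction and y the lateral direction; as in the caption, each parameter is assigned to the foot placed last:

```python
def spatial_parameters(placements):
    # placements: ordered list of (x, y, foot) foot-contact positions.
    # Step length is the forward distance between consecutive placements;
    # step width is the lateral distance between them.
    params = []
    for prev, curr in zip(placements, placements[1:]):
        (x0, y0, _), (x1, y1, foot) = prev, curr
        params.append({"foot": foot,
                       "step_length": x1 - x0,
                       "step_width": abs(y1 - y0)})
    return params

# Left start, alternating feet; in practice the first left step
# would be excluded from the analysis (acceleration phase).
steps = spatial_parameters([(0.0, 0.1, "L"), (0.6, -0.1, "R"), (1.3, 0.1, "L")])
```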
Fig 7
Fig 7. Identification of the foot events.
Foot events (black crosses) were identified from the ankle trajectories, here exemplified using the Kinect skeleton averaged across all sensors. Using the same procedure, foot events were extracted from the VICON data. Top: snapshots of the body posture during walking; bottom: ankle position of the left (red) and right (blue) foot in walking direction.
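The caption does not state the exact detection rule, but a common sketch is to treat heel strikes and toe offs as local extrema of the ankle position relative to the body's forward position: the ankle is maximally in front of the body at heel strike and maximally behind at toe off. A minimal, hypothetical illustration:

```python
def foot_events(ankle_x, body_x):
    # ankle_x, body_x: per-frame positions along the walking direction.
    # Relative to the body, local maxima of the ankle trajectory mark
    # candidate heel strikes, local minima candidate toe offs.
    rel = [a - b for a, b in zip(ankle_x, body_x)]
    strikes, offs = [], []
    for i in range(1, len(rel) - 1):
        if rel[i] > rel[i - 1] and rel[i] >= rel[i + 1]:
            strikes.append(i)   # local maximum: candidate heel strike
        if rel[i] < rel[i - 1] and rel[i] <= rel[i + 1]:
            offs.append(i)      # local minimum: candidate toe off
    return strikes, offs

# Toy trajectory: the ankle swings forward, rests during stance, swings again.
ankle = [0.0, 0.4, 0.8, 0.8, 0.8, 1.2, 1.6, 1.6, 1.6]
body  = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]
strikes, offs = foot_events(ankle, body)
```

A real implementation would smooth the trajectories first and reject extrema below a displacement threshold to suppress tracking noise.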

References

    1. Geerse DJ, Coolen BH, Roerdink M. Kinematic Validation of a Multi-Kinect v2 Instrumented 10-Meter Walkway for Quantitative Gait Assessments. PLoS ONE. 2015;10:e0139913. doi:10.1371/journal.pone.0139913
    2. Mentiplay BF, Perraton LG, Bower KJ, Pua Y-H, McGaw R, Heywood S, et al. Gait assessment using the Microsoft Xbox One Kinect: Concurrent validity and inter-day reliability of spatiotemporal and kinematic variables. J Biomech. 2015;48:2166–2170. doi:10.1016/j.jbiomech.2015.05.021
    3. Kaenchan S, Mongkolnam P, Watanapa B, Sathienpong S. Automatic multiple Kinect cameras setting for simple walking posture analysis. International Computer Science and Engineering Conference (ICSEC); 2013. pp. 245–249.
    4. Wang Q, Kurillo G, Ofli F, Bajcsy R. Evaluation of Pose Tracking Accuracy in the First and Second Generations of Microsoft Kinect. IEEE International Conference on Healthcare Informatics (ICHI); 2015. pp. 380–389.
    5. Clark RA, Pua Y-H, Oliveira CC, Bower KJ, Thilarajah S, McGaw R, et al. Reliability and concurrent validity of the Microsoft Xbox One Kinect for assessment of standing balance and postural control. Gait Posture. 2015;42:210–213. doi:10.1016/j.gaitpost.2015.03.005
