Sensors (Basel). 2023 Jan 12;23(2):878. doi: 10.3390/s23020878.

How the Processing Mode Influences Azure Kinect Body Tracking Results

Linda Büker et al.

Abstract

The Azure Kinect DK is an RGB-D camera popular in research and studies with humans. For good scientific practice, it is important that Azure Kinect yields consistent and reproducible results. We noticed, however, that the results it yields are inconsistent. We therefore examined 100 body tracking runs per processing mode provided by the Azure Kinect Body Tracking SDK on two different computers, using a prerecorded video. We compared those runs with respect to spatiotemporal progression (spatial distribution of joint positions per processing mode and run), derived parameters (bone length), and differences between the computers. We found a previously undocumented converging behavior of joint positions at the start of the body tracking. Euclidean distances of joint positions varied between runs by a clinically relevant margin of up to 87 mm for CUDA and TensorRT; CPU and DirectML showed no differences between runs on the same computer. Additionally, we found noticeable differences between the two computers. We therefore recommend choosing the processing mode carefully, reporting it, and performing all analyses on the same computer to ensure reproducible results when using Azure Kinect and its body tracking in research. Consequently, results from previous studies with Azure Kinect should be reevaluated, and until then their findings should be interpreted with caution.
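
The processing mode in question is chosen when the body tracker is created. As a minimal sketch of where this choice enters, assuming the C API of the Azure Kinect Body Tracking SDK v1.1.x (enum and function names are from its public headers; error handling is reduced to the essentials):

#include <k4a/k4a.h>
#include <k4abt.h>

/* Create a body tracker with an explicitly chosen processing mode.
   Assumes `calibration` was obtained via k4a_device_get_calibration(),
   or via k4a_playback_get_calibration() for a prerecorded video. */
k4abt_tracker_t create_tracker(const k4a_calibration_t *calibration)
{
    k4abt_tracker_configuration_t config = K4ABT_TRACKER_CONFIG_DEFAULT;

    /* The mode under test; the alternatives compared in the paper are
       K4ABT_TRACKER_PROCESSING_MODE_CPU,
       K4ABT_TRACKER_PROCESSING_MODE_GPU_DIRECTML, and
       K4ABT_TRACKER_PROCESSING_MODE_GPU_TENSORRT. */
    config.processing_mode = K4ABT_TRACKER_PROCESSING_MODE_GPU_CUDA;

    k4abt_tracker_t tracker = NULL;
    if (k4abt_tracker_create(calibration, config, &tracker) != K4A_RESULT_SUCCEEDED)
    {
        return NULL; /* creation fails if the backend for this mode is unavailable */
    }
    return tracker;
}

Reporting the value assigned to config.processing_mode alongside any published results is the concrete form of the recommendation above.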

Keywords: Azure Kinect; Azure Kinect Body Tracking SDK; body tracking; quality assurance; reproducibility; skeleton tracking.

Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure A1
X-, y-, and z-axes of the joint positions of PELVIS and FOOT_LEFT for all four processing modes on computer A. (a) X-, y-, z-axes for the joint position of PELVIS. (b) X-, y-, z-axes for the joint position of FOOT_LEFT.
Figure A2
X-, y-, and z-axes of the joint positions of PELVIS and FOOT_LEFT for all four processing modes for two different computers. The blue lines are from computer A and the orange from computer B. (a) X-, y-, z-axes for the joint position of PELVIS. (b) X-, y-, z-axes for the joint position of FOOT_LEFT.
Figure A3
Ellipsoids of the joint position of PELVIS for all four processing modes. The different colors represent different body tracking runs.
Figure A4
Ellipsoids of the joint position of KNEE_LEFT for all four processing modes. The different colors represent different body tracking runs.
Figure A5
Ellipsoids for the joint position of FOOT_LEFT for all four processing modes. The different colors represent different body tracking runs.
Figure 1
Overview of the experimental setup. (a) Schematic setup. In addition, the coordinate system of the depth camera is shown, which is tilted downwards by 6° with respect to the camera’s case. (b) Mannequin from the camera’s point of view in a windowless dark room (picture taken with the lights turned on). (c) Point cloud of the mannequin with overlaid body tracking. Screenshot of k4abt_simple_3d_viewer from the Azure Kinect Body Tracking SDK.
Figure 2
Schematic overview of the data processing and the three experiments.
Figure 3
X-, y-, z-axes of the first 90 body tracking frames using 100 body tracking runs for all four processing modes for joint positions of PELVIS and WRIST_LEFT. Note: the runs for CPU and DirectML yielded the same results for each run and, therefore, appear as a single line. (a) X-, y-, z-axes of the first 90 body tracking frames for the joint position of PELVIS. (b) X-, y-, z-axes of the first 90 body tracking frames for the joint position of WRIST_LEFT.
Figure 4
X-position data (red) of ELBOW_RIGHT, fitted exponential curve (blue), as well as one (purple) and four (green) times the half-life time of the fitted exponential curve.
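
For orientation, the half-life marked in this plot follows directly from the fitted decay constant. A minimal sketch in LaTeX notation, assuming a fit of the form below (the exact parametrization used in the paper is not given in the caption):

x(t) = a\,e^{-\lambda t} + c, \qquad t_{1/2} = \frac{\ln 2}{\lambda}

After four half-life times the curve has closed all but 1/16 of the initial gap to its asymptote c, which makes four half-lives a plausible cut-off for the converging phase.
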
Figure 5
X-, y-, and z-axes of the joint positions of PELVIS and FOOT_LEFT for all four processing modes; extract of relevant plots (plots for all processing modes and axes are shown in Figure A1). (a) Examples of a stable, steady value range. (b) Examples of a switch between two steady value ranges.
Figure 6
Box plots of the ellipsoid volumes for all processing modes for the 100 body tracking runs. As the standard deviation for processing modes CPU and DirectML is zero, only the mean is shown for these modes. Note: the y-axis has a logarithmic scale.
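
For reference, the volume of an ellipsoid with semi-axes a, b, and c is the standard

V = \frac{4}{3}\,\pi\,abc.

How the semi-axes are derived from the joint-position scatter of a run is not stated in the caption; only the volume formula itself is assumed here.
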
Figure 7
Ellipsoids of the joint position of PELVIS for processing mode CUDA (plots for all processing modes are shown in Figure A3). The different colors represent different body tracking runs.
Figure 8
Ellipsoids of the joint position of KNEE_LEFT for relevant processing modes (plots for all processing modes are shown in Figure A4). The different colors represent different body tracking runs.
Figure 9
Ellipsoids of the joint position of FOOT_LEFT for relevant processing modes (plots for all processing modes are shown in Figure A5). The different colors represent different body tracking runs.
Figure 10
X-y- and z-y-plots of the joint positions seen from the frontal and left-side perspectives for all four processing modes.
Figure 11
Euclidean distance with 100 body tracking runs using CUDA and TensorRT. (a) Euclidean distances with 100 body tracking runs using CUDA. (b) Euclidean distances with 100 body tracking runs using TensorRT.
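
Assuming the distances are taken between the positions of the same joint in the same frame for two runs i and j (a plausible reading of the setup), the plotted quantity is the ordinary Euclidean norm:

d_{ij}(t) = \lVert \mathbf{p}_i(t) - \mathbf{p}_j(t) \rVert_2 = \sqrt{(x_i - x_j)^2 + (y_i - y_j)^2 + (z_i - z_j)^2},

of which the up to 87 mm reported in the abstract is the largest observed value.
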
Figure 12
Box plots of bone length using 100 body tracking runs for all four processing modes.
Figure 13
X-, y-, and z-axes of the joint positions of PELVIS and FOOT_LEFT for all four processing modes on two different computers; extract of relevant plots (plots for all processing modes and axes are shown in Figure A2). The blue lines represent computer A, and the orange ones computer B. (a) Examples of similar behavior between computers A and B. (b) Examples of very close steady value ranges for computers A and B. (c) Examples of a switch between two steady value ranges at different frames. (d) Examples of a switch between two steady value ranges for only one computer.
Figure 14
Box plots of bone length using 100 body tracking runs for all four processing modes on computers A and B.
