Sensors (Basel). 2018 Sep 16;18(9):3122. doi: 10.3390/s18093122.

A Versatile Method for Depth Data Error Estimation in RGB-D Sensors

Elizabeth V Cabrera et al.

Abstract

We propose a versatile method for estimating the RMS error of depth data provided by generic 3D sensors capable of generating RGB and depth (D) data of the scene, i.e., those based on techniques such as structured light, time of flight and stereo. A common checkerboard is used: the corners are detected and two point clouds are created, one with the real coordinates of the pattern corners and one with the corner coordinates given by the device. After registering these two clouds, the RMS error is computed. Then, using curve fitting methods, an equation is obtained that generalizes the RMS error as a function of the distance between the sensor and the checkerboard pattern. The depth errors estimated by our method are compared to those estimated by state-of-the-art approaches, validating its accuracy and utility. This method can be used to rapidly estimate the quality of RGB-D sensors, facilitating robotics applications such as SLAM and object recognition.
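The registration-and-RMS step described above can be sketched with an SVD-based (Kabsch) rigid alignment between the ideal corner cloud and the measured one. This is an illustrative sketch, not the authors' implementation; the checkerboard geometry, noise level and function names below are assumptions:

```python
import numpy as np

def kabsch_rms(source, target):
    """Rigidly align `source` to `target` (known point correspondences)
    and return the post-registration RMS error."""
    mu_s, mu_t = source.mean(axis=0), target.mean(axis=0)
    S, T = source - mu_s, target - mu_t
    # Optimal rotation from the SVD of the cross-covariance matrix.
    U, _, Vt = np.linalg.svd(S.T @ T)
    d = np.sign(np.linalg.det(U @ Vt))          # guard against reflections
    R = (U @ np.diag([1.0, 1.0, d]) @ Vt).T
    aligned = (source - mu_s) @ R.T + mu_t
    return np.sqrt(np.mean(np.sum((aligned - target) ** 2, axis=1)))

# Ideal checkerboard corners: a 6 x 9 grid of 40 mm squares in a plane.
xs, ys = np.meshgrid(np.arange(9) * 0.04, np.arange(6) * 0.04)
ideal = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)])

# Simulated sensor measurement: rigidly displaced, with noisy depth.
rng = np.random.default_rng(0)
theta = 0.1
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
measured = ideal @ Rz.T + np.array([0.2, -0.1, 1.5])
measured[:, 2] += rng.normal(0.0, 0.003, measured.shape[0])  # ~3 mm depth noise

print(f"RMS after registration: {kabsch_rms(measured, ideal):.4f} m")
```

With real sensors, the "measured" cloud would come from back-projecting the detected 2D corners using the depth map, while the "ideal" cloud is built from the known square size of the pattern.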

Keywords: RGB-D sensors; RMS error; accuracy.


Conflict of interest statement

The authors declare no conflict of interest. The funding sponsors had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Figures

Figure 1
Data acquisition scheme.
Figure 2
Correspondence between RGB image and depth map.
Figure 3
Image projection of the ideal and estimated point.
Figure 4
RMS error for (a) Kinect v1, (b) Kinect v2 and (c) ZED, represented for polynomial and exponential models.
Figure 5
Comparison between methods to estimate the depth error in the Kinect v1. (a) Comparison of our method with the Rauscher approach [2], using the results of the depth error in terms of RMS (eRMS), and (b) comparison of our proposal with the Khoshelham approach [5], using the standard deviation (σZ) to represent the error.
Figure 6
Dispersion graphs of the Kinect v1 depth error curves: (a) Rauscher et al. [2] vs. our RMS error (eRMS) and (b) Khoshelham and Elberink [5] vs. our standard deviation of the depth error (σZ).
Figure 7
Comparison of the average RMS error in meters (for each pixel in a central window of 150 × 150) of 300 depth maps captured with the Kinect v1 (640 × 480 px), Kinect v2 (512 × 424 px) and ZED camera (672 × 376 px), (a,c,e) before and (b,d,f) after a simple correction process. The more yellow a pixel, the larger its RMS error.
Figure 8
Registration example of two point clouds with known correspondences (captured with the Kinect v1), (a) before and (b) after correction. The source point cloud (black) is captured at 0.5 m and the target cloud (cyan) at 4 m from the sensor.
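The RMS-versus-distance curves of Figure 4 are produced by fitting polynomial and exponential models to per-distance error measurements. A minimal sketch of both fits on synthetic data; the distances, coefficients and noise level below are assumptions, not values from the paper:

```python
import numpy as np

# Hypothetical per-distance RMS measurements (meters). The quadratic
# growth mimics the error behaviour typical of structured-light sensors.
z = np.arange(0.5, 4.1, 0.5)          # sensor-to-pattern distances (m)
rng = np.random.default_rng(1)
e_rms = 0.0012 * z**2 + 0.001 + 0.0002 * rng.normal(size=z.size)

# Polynomial model: e(z) = a*z^2 + b*z + c (least-squares fit).
poly = np.polyfit(z, e_rms, 2)

# Exponential model: e(z) = a*exp(b*z), fitted by log-linearisation,
# i.e., a straight-line fit of log(e) against z.
b, log_a = np.polyfit(z, np.log(e_rms), 1)
a = np.exp(log_a)

print("polynomial coefficients:", poly)
print(f"exponential model: {a:.5f} * exp({b:.3f} z)")
```

Either fitted equation then generalizes the RMS error as a function of sensor-to-pattern distance, which is the paper's stated goal.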

References

    1. Fankhauser P., Bloesch M., Rodriguez D., Kaestner R., Hutter M., Siegwart R. Kinect v2 for mobile robot navigation: Evaluation and modeling. Proceedings of the 2015 International Conference on Advanced Robotics (ICAR); Istanbul, Turkey, 27–31 July 2015; pp. 388–394.
    2. Rauscher G., Dube D., Zell A. A Comparison of 3D Sensors for Wheeled Mobile Robots. In: Menegatti E., Michael N., Berns K., Yamaguchi H., editors. Intelligent Autonomous Systems 13. Springer International Publishing; Cham, Switzerland: 2016; pp. 29–41.
    3. He Y., Liang B., Zou Y., He J., Yang J. Depth errors analysis and correction for Time-of-Flight (ToF) cameras. Sensors. 2017;17:92. doi: 10.3390/s17010092.
    4. Chang C., Chatterjee S. Quantization error analysis in stereo vision. Proceedings of the Conference Record of the Twenty-Sixth Asilomar Conference on Signals, Systems & Computers; Pacific Grove, CA, USA, 26–28 October 1992; pp. 1037–1041.
    5. Khoshelham K., Elberink S.O. Accuracy and Resolution of Kinect Depth Data for Indoor Mapping Applications. Sensors. 2012;12:1437–1454. doi: 10.3390/s120201437.
