Depth Errors Analysis and Correction for Time-of-Flight (ToF) Cameras

Ying He et al. Sensors (Basel). 2017 Jan 5;17(1):92. doi: 10.3390/s17010092.

Abstract

Time-of-Flight (ToF) cameras, a technology that has developed rapidly in recent years, are 3D imaging sensors that provide a depth image as well as an amplitude image at a high frame rate. Because a ToF camera is limited by its imaging conditions and the external environment, its captured data are always subject to certain errors. This paper analyzes the influence of typical external factors, including material, color, distance, and lighting, on the depth error of ToF cameras. Our experiments indicate that lighting, color, material, and distance each affect the depth error in different ways; because the forms of these errors vary, it is difficult to summarize them in a unified law. To further improve measurement accuracy, this paper proposes an error correction method based on a Particle Filter-Support Vector Machine (PF-SVM). Experimental results show that this method reduces the depth error of ToF cameras to 4.6 mm over the camera's full measurement range (0.5-5 m).
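The paper's PF-SVM corrector itself is not reproduced here; as a minimal sketch of the underlying idea, the toy example below learns a systematic depth-error model from calibration pairs of (measured depth, true depth) and then subtracts the predicted error from raw readings. All data, the linear error form, and the function names are illustrative assumptions, not the paper's method or results.

```python
def fit_linear(xs, ys):
    """Closed-form least-squares fit of y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Synthetic calibration set: true depths over 0.5-5 m; the "camera"
# over-reports depth by a distance-dependent bias (made-up numbers).
true_d = [0.5 + 0.5 * i for i in range(10)]        # 0.5 .. 5.0 m
measured = [d + 0.01 + 0.004 * d for d in true_d]  # synthetic bias

# Model the error (measured - true) as a function of the measured depth.
errors = [m - t for m, t in zip(measured, true_d)]
a, b = fit_linear(measured, errors)

def correct(depth_m):
    """Subtract the predicted systematic error from a raw reading."""
    return depth_m - (a + b * depth_m)

residual = max(abs(correct(m) - t) for m, t in zip(measured, true_d))
print(f"max residual after correction: {residual * 1000:.3f} mm")
```

In the paper's actual pipeline, the error model is an SVM regression whose hyperparameters are tuned by a particle filter; the simple linear fit above only stands in for that learned model to show where the correction step sits.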

Keywords: SVM; ToF camera; depth error; error correction; error modeling; particle filter.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1. Development history of ToF cameras.
Figure 2. Principle of ToF cameras.
Figure 3. Experimental scene. (a) Experimental scene; (b) Camera bracket.
Figure 4. Influence of lighting on depth errors.
Figure 5. Influence of color on depth errors.
Figure 6. Four boards made of different materials.
Figure 7. Depth data of two sensors.
Figure 8. The measured cone.
Figure 9. Measurement errors of the cone.
Figure 10. Complex scene.
Figure 11. Depth images based on the point cloud of depth sensors.
Figure 12. Process of the PF-SVM algorithm.
Figure 13. Depth error and error model.
Figure 14. Depth error correction results.
Figure 15. Depth error correction results of various colors and error model.
Figure 16. Depth error correction results and error model.
