Sensors. 2022 Nov 27;22(23):9222. doi: 10.3390/s22239222.

Indoor 3D Reconstruction of Buildings via Azure Kinect RGB-D Camera


Chaimaa Delasse et al. Sensors (Basel).

Abstract

With the development of 3D vision techniques, RGB-D cameras are increasingly used to allow easier and cheaper access to the third dimension. In this paper, we focus on testing the potential of the Kinect Azure RGB-D camera in the 3D reconstruction of indoor scenes. First, a series of investigations of the hardware was performed to evaluate its accuracy and precision. The results show that the measurements made with the Azure could be exploited for close-range survey applications. Second, we performed a methodological workflow for indoor reconstruction based on the Open3D framework, which was applied to two different indoor scenes. Based on the results, we can state that the quality of 3D reconstruction significantly depends on the architecture of the captured scene. This was supported by a comparison of the point cloud from the Kinect Azure with that from a terrestrial laser scanner and another from a mobile laser scanner. The results show that the average differences do not exceed 8 mm, which confirms that the Kinect Azure can be considered a 3D measurement system at least as reliable as a mobile laser scanner.
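The indoor reconstruction workflow described above starts from RGB-D frames, whose depth pixels are back-projected into 3D through the pinhole camera model. The sketch below illustrates that step with NumPy only; the intrinsics and the toy depth image are illustrative values, not the Azure Kinect's calibrated parameters or the paper's data.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (in metres) to an N x 3 point cloud
    with the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy.
    Pixels with zero depth (no return) are dropped."""
    v, u = np.indices(depth.shape)          # row (v) and column (u) indices
    z = depth.ravel()
    valid = z > 0
    u, v, z = u.ravel()[valid], v.ravel()[valid], z[valid]
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.column_stack((x, y, z))

# Toy 2x2 depth image with one dropped (zero-depth) pixel.
depth = np.array([[1.0, 0.0],
                  [2.0, 1.5]])
pts = depth_to_points(depth, fx=600.0, fy=600.0, cx=0.5, cy=0.5)
```

A full pipeline such as the Open3D-based workflow used in the paper additionally colors each point from the registered RGB image and fuses successive frames via odometry, but the geometric core is this back-projection.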

Keywords: 3D indoor reconstruction; Azure Kinect; MLS; RGB-D; TLS.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1. Overview of the general workflow.
Figure 2. Indoor office scene used for testing image averaging.
Figure 3. Operating mode for testing the influence of different ranges on measurement accuracy.
Figure 4. Design and parameters of the OpenCV ChArUco board.
Figure 5. Workflow of geometric calibration in OpenCV.
Figure 6. Diagram of the reconstruction of a 3D point cloud from RGB-D data in Open3D.
Figure 7. Workflow of data acquisition and pairwise comparison of point clouds based on absolute C2C distances.
Figure 8. Evolution of standard deviations [in mm] of measurements according to the number of averaged frames.
Figure 9. Evolution of the standard deviation of a pixel as a function of the number of averaged frames.
Figure 10. Comparison of the point cloud extracted from the Kinect Azure with the cloud from the TLS: (a) differences between the two clouds [in m]; (b) histogram of the values [in m].
Figure 11. Schema of the difference between the distance measured by the Kinect and the true horizontal distance from a point on the ground.
Figure 12. Evolution of the deviations, in mm, from the "true" distances.
Figure 13. Evolution of standard deviations of measurements as a function of distance [in mm].
Figure 14. Result of the indoor 3D reconstruction of the room: (a) global view of the room from the top; (b–d) focus on corners and objects.
Figure 15. Results of indoor 3D reconstruction of the office scene: (a) keeping the windows uncovered; (b) covering the windows.
Figure 16. Odometry drift effect highlighted by superposing point clouds from the TLS (green) and the Kinect Azure (purple): (a) top view of the point clouds, indicating the start point and the trajectory; (b) perspective view; (c) zoom on a corner of the room to highlight the odometry drift.
Figure 17. Means and standard deviations of C2C distances between the point clouds: (a) TLS vs. MLS; (b) TLS vs. Kinect Azure; (c) MLS vs. Kinect Azure.
Figure 18. Comparison of the intersection of the wall with the ground: (a) TLS (white) vs. Kinect Azure (color); (b) MLS (white) vs. Kinect Azure (color).
Figure 19. Multipath phenomenon observed at the intersection of two walls in the upper corner of a room, as indicated by the red arrows; pixels in black have zero depth values: depth image (left), passive infrared image (right). Image: Microsoft.
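The C2C (cloud-to-cloud) comparisons referenced above reduce to, for each point of one cloud, the Euclidean distance to its nearest neighbour in the other cloud. A brute-force NumPy sketch follows; the two tiny clouds are invented to mirror the sub-8 mm deviations reported in the abstract, not the paper's data, and a real comparison would use a KD-tree (e.g. in CloudCompare or SciPy) rather than an O(N·M) scan.

```python
import numpy as np

def c2c_distances(cloud_a, cloud_b):
    """Absolute C2C distances: for every point of cloud_a, the Euclidean
    distance to its nearest neighbour in cloud_b. Brute force, O(N*M);
    fine for a sketch, too slow for full building scans."""
    diffs = cloud_a[:, None, :] - cloud_b[None, :, :]   # (N, M, 3)
    return np.sqrt((diffs ** 2).sum(axis=2)).min(axis=1)

# Hypothetical clouds: one point offset by 8 mm, one coincident.
a = np.array([[0.0, 0.0, 0.0],
              [1.0, 0.0, 0.0]])
b = np.array([[0.0, 0.0, 0.008],
              [1.0, 0.0, 0.0]])
d = c2c_distances(a, b)   # per-point distances in metres
```

Averaging `d` gives the mean C2C deviation used to rank the Kinect Azure against the TLS and MLS clouds.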

References

    1. Macher H., Landes T., Grussenmeyer P. ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences. Volume II-5-W3. Copernicus GmbH; Taipei, Taiwan: 2015. Point Clouds Segmentation as Base for As-Built BIM Creation; pp. 191–197.
    1. Henry P., Krainin M., Herbst E., Ren X., Fox D. RGB-D Mapping: Using Kinect-Style Depth Cameras for Dense 3D Modeling of Indoor Environments. Int. J. Robot. Res. 2012;31:647–663. doi: 10.1177/0278364911434148. - DOI
    1. Li Y., Li W., Tang S., Darwish W., Hu Y., Chen W. Automatic Indoor As-Built Building Information Models Generation by Using Low-Cost RGB-D Sensors. Sensors. 2020;20:293. doi: 10.3390/s20010293. - DOI - PMC - PubMed
    1. Lachat E., Landes T., Grussenmeyer P. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. Volume XLI-B5. Copernicus GmbH; Prague, Czech Republic: 2016. Combination Of Tls Point Clouds And 3d Data From Kinect V2 Sensor To Complete Indoor Models; pp. 659–666.
    1. Tölgyessy M., Dekan M., Chovanec Ľ., Hubinský P. Evaluation of the Azure Kinect and Its Comparison to Kinect V1 and Kinect V2. Sensors. 2021;21:413. doi: 10.3390/s21020413. - DOI - PMC - PubMed

LinkOut - more resources