Review

Sensors (Basel). 2025 Jun 20;25(13):3838. doi: 10.3390/s25133838.

A Review on UAS Trajectory Estimation Using Decentralized Multi-Sensor Systems Based on Robotic Total Stations

Lucas Dammert et al.

Abstract

In our contribution, we conduct a thematic literature review on trajectory estimation using a decentralized multi-sensor system based on robotic total stations (RTS), with a focus on unmanned aerial system (UAS) platforms. While RTS are commonly used for trajectory estimation in areas where GNSS is not sufficiently accurate or is unavailable, they are rarely used for UAS trajectory estimation. Extending the RTS with integrated camera images allows for UAS pose estimation (position and orientation). We review existing research on the entire RTS measurement process, including time synchronization, atmospheric refraction, prism interaction, and RTS-based image evaluation. Additionally, we focus on integrated trajectory estimation using UAS onboard measurements such as IMU and laser scanning data. Although many existing articles address individual steps of the decentralized multi-sensor system, we demonstrate that a combination of existing work on UAS trajectory estimation and RTS calibration is needed to achieve trajectory estimation at sub-cm and sub-0.01 gon accuracies, and we identify the challenges that must be addressed. Investigations into the use of RTS for kinematic tasks must be extended to realistic distances (approx. 300-500 m) and speeds (>2.5 m s⁻¹). In particular, image acquisition with the integrated camera must be complemented by a time synchronization approach. Regarding the estimation of UAS orientation based on RTS camera images, the results of initial simulation studies must be validated by field tests, and existing approaches for integrated trajectory estimation must be adapted to optimally integrate RTS data.
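To put the stated accuracy targets into perspective: the lateral offset caused by an angular error grows linearly with distance (offset ≈ distance × angle in radians, with 1 gon = π/200 rad). The following minimal Python sketch illustrates this relation; the 0.3 mgon instrument precision used in the example is an assumed typical RTS value, not a figure taken from the review.

    import math

    GON_TO_RAD = math.pi / 200.0   # 1 gon = pi/200 rad

    def lateral_offset_m(distance_m, angle_gon):
        """Approximate lateral displacement caused by an angular error of
        angle_gon at the given measurement distance (small-angle approximation)."""
        return distance_m * angle_gon * GON_TO_RAD

    # Assumed 0.3 mgon RTS angular precision vs. the 0.01 gon orientation target
    for d in (300.0, 500.0):
        print(f"{d:.0f} m: 0.3 mgon -> {lateral_offset_m(d, 0.0003) * 1000:.1f} mm, "
              f"0.01 gon -> {lateral_offset_m(d, 0.01) * 100:.1f} cm")

At 500 m, 0.3 mgon already corresponds to about 2.4 mm, while 0.01 gon corresponds to roughly 7.9 cm, which illustrates why sub-cm positioning at these ranges hinges on the angular measurements and their time synchronization.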

Keywords: 6-DoF trajectory estimation; UAV; image-assisted total station; sensor synchronization.


Conflict of interest statement

Author David Monetti is affiliated with the company Skyability GmbH. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Figures

Figure 1
Decentralized multi-sensor system: ground segment consisting of multiple IATS, and kinematic segment consisting of the UAS with IMU, GNSS antenna and receiver, and a mapping sensor, e.g., a camera. The letters refer to Table 1: A relates to the IATS synchronization process and data recording, B to the interaction of the measurement with the atmosphere, C to the camera image acquisition, and D to the processing of the acquired image for orientation estimation. E relates to trajectory estimation using IMU data in addition to the IATS measurements, and F to integrated trajectory estimation using IATS, IMU, and point cloud information in the form of correspondences.
Figure 2
Steps of the temporal calibration routine [5].
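The individual steps of the routine in [5] are not reproduced here. Purely as a generic illustration of one building block of temporal calibration, the sketch below estimates a constant time offset between two equally sampled signals (e.g., one coordinate of the RTS track and of a reference track) by cross-correlation; the function and variable names are illustrative assumptions, not part of the cited routine.

    import numpy as np

    def estimate_time_offset(t, sig_a, sig_b, max_lag_s=1.0):
        """Estimate a constant time offset between two equally sampled 1-D
        signals by locating the peak of their cross-correlation.
        A positive result means sig_a lags sig_b by that many seconds."""
        a = np.asarray(sig_a, float); a = a - a.mean()
        b = np.asarray(sig_b, float); b = b - b.mean()
        dt = float(t[1] - t[0])                         # sampling interval (s)
        lags = np.arange(-int(max_lag_s / dt), int(max_lag_s / dt) + 1)
        corr = [np.dot(a[max(0, k):len(a) + min(0, k)],
                       b[max(0, -k):len(b) + min(0, -k)]) for k in lags]
        return float(lags[int(np.argmax(corr))]) * dt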
Figure 3
Kinematic formulation of polar measurements using time-dependent azimuth R(t), distance D(t), and their derivatives ω(t) = dR/dt and v_d(t) = dD/dt. The angle between the line of sight and the moving direction is denoted by v̆(t) [5].
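As a minimal illustration of this kinematic polar formulation (horizontal case only, azimuth in radians; the formulation in [5] additionally handles the vertical angle and instrument-specific corrections), the position and velocity of the target follow from D(t), R(t), and their derivatives via the chain rule:

    import math

    def polar_to_cartesian_kinematic(D, R, v_d, omega):
        """Horizontal position and velocity of a moving prism from kinematic
        polar measurements: distance D (m), azimuth R (rad, from north), and
        their time derivatives v_d = dD/dt (m/s) and omega = dR/dt (rad/s).
        Convention: x = north, y = east."""
        x, y = D * math.cos(R), D * math.sin(R)
        # chain rule on x(t) = D(t) cos R(t), y(t) = D(t) sin R(t)
        vx = v_d * math.cos(R) - D * omega * math.sin(R)
        vy = v_d * math.sin(R) + D * omega * math.cos(R)
        return (x, y), (vx, vy)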
Figure 4
Forward intersection with two RTS, adapted from [64].
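A minimal sketch of the geometric principle of forward intersection with two stations, assuming known station coordinates and measured azimuths only; in practice, distances are also measured and the position is obtained by least-squares adjustment. The function name and the numeric example are illustrative.

    import numpy as np

    def forward_intersection(station_a, az_a, station_b, az_b):
        """2-D forward intersection: intersect the lines of sight of two total
        stations with known coordinates (north, east) and measured azimuths
        (rad, from north towards east). Returns the target coordinates."""
        a, b = np.asarray(station_a, float), np.asarray(station_b, float)
        d_a = np.array([np.cos(az_a), np.sin(az_a)])   # unit direction from A
        d_b = np.array([np.cos(az_b), np.sin(az_b)])   # unit direction from B
        # solve  a + s*d_a = b + t*d_b  for the scalars s and t
        s, _ = np.linalg.solve(np.column_stack([d_a, -d_b]), b - a)
        return a + s * d_a

    # Example: stations 100 m apart (east direction), target north of both
    print(forward_intersection((0.0, 0.0), np.deg2rad(45.0),
                               (0.0, 100.0), np.deg2rad(315.0)))   # ~[50, 50]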
Figure 5
Principle of trajectory connection based on 3D correspondence detection. Two misaligned LiDAR point clouds (blue and orange) belong to two overlapping flight lines; adapted from [17].
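In the reviewed approach, such correspondences enter the integrated trajectory estimation directly. Purely as an illustration of how 3-D correspondences constrain the relative pose of two flight lines, the sketch below computes the least-squares rigid transform between corresponding points (Kabsch/Horn method); all names are assumptions.

    import numpy as np

    def rigid_align(p, q):
        """Least-squares rigid transform (rotation R, translation t) mapping the
        3-D points p onto their correspondences q (both of shape (n, 3)),
        estimated with the Kabsch/Horn method; afterwards q_i ~ R @ p_i + t."""
        cp, cq = p.mean(axis=0), q.mean(axis=0)
        H = (p - cp).T @ (q - cq)                       # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
        R = Vt.T @ S @ U.T
        t = cq - R @ cp
        return R, t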
Figure 6
Standard kinematic mapping processing pipeline with trajectory-level error modeling [3].
Figure 7
Holistic kinematic mapping processing pipeline with sensor-level error modeling [3].

References

    1. Dreier A., Janßen J., Kuhlmann H., Klingbeil L. Quality Analysis of Direct Georeferencing in Aspects of Absolute Accuracy and Precision for a UAV-Based Laser Scanning System. Remote Sens. 2021;13:3564. doi: 10.3390/rs13183564.
    2. Skaloud J., Lichti D. Rigorous approach to bore-sight self-calibration in airborne laser scanning. ISPRS J. Photogramm. Remote Sens. 2006;61:47–59. doi: 10.1016/j.isprsjprs.2006.07.003.
    3. Pöppl F., Neuner H., Mandlburger G., Pfeifer N. Integrated trajectory estimation for 3D kinematic mapping with GNSS, INS and imaging sensors: A framework and review. ISPRS J. Photogramm. Remote Sens. 2023;196:287–305. doi: 10.1016/j.isprsjprs.2022.12.022.
    4. Subirana J.S., Zornoza J.J., Hernández-Pajares M. Time References in GNSS. 2011. Available online: https://gssc.esa.int/navipedia/index.php/Time_References_in_GNSS (accessed on 15 May 2025).
    5. Thalmann T., Neuner H. Temporal calibration and synchronization of robotic total stations for kinematic multi-sensor-systems. J. Appl. Geod. 2021;15:13–30. doi: 10.1515/jag-2019-0070.
