Sensors (Basel). 2022 Dec 16;22(24):9908.
doi: 10.3390/s22249908

A GNSS/INS/LiDAR Integration Scheme for UAV-Based Navigation in GNSS-Challenging Environments

Ahmed Elamin et al. Sensors (Basel).

Abstract

Unmanned aerial vehicle (UAV) navigation has recently been the focus of many studies. The most challenging aspect of UAV navigation is maintaining accurate and reliable pose estimation. In outdoor environments, global navigation satellite systems (GNSS) are typically used for UAV localization. However, relying solely on GNSS can pose safety risks in the event of a receiver malfunction or an antenna installation error. In this research, an unmanned aerial system (UAS) carrying an Applanix APX-15 GNSS/IMU board, a Velodyne Puck LiDAR sensor, and a Sony α7R II high-resolution camera was used to collect data for developing a multi-sensor integration system. Owing to a malfunctioning GNSS antenna, the data contained numerous prolonged GNSS signal outages, and the GNSS/INS processing failed with a position error exceeding 25 km. To resolve this issue and recover the precise trajectory of the UAV, a GNSS/INS/LiDAR integrated navigation system was developed. The LiDAR data were first processed with the optimized LOAM SLAM algorithm, which yielded position and orientation estimates. The camera images were then processed in Pix4D Mapper with ground control points (GCPs), producing precise camera positions and orientations that served as ground truth. All sensor data were GPS-timestamped, and all datasets were sampled at 10 Hz to match the LiDAR scan rate. Two case studies were considered: a complete GNSS outage and assistance from a GNSS PPP solution. Relative to the complete-outage case, the GNSS-assisted results were significantly improved, with RMSE reductions of approximately 51% and 78% in the horizontal and vertical directions, respectively. The RMSE of the roll and yaw angles was also reduced by 13% and 30%, respectively, although the RMSE of the pitch angle increased by about 13%.
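The reported improvements are RMSE reductions between the two case studies. As a minimal sketch of how such figures are computed (the paper's actual error series are not reproduced here; the numbers below are made up for illustration):

```python
import numpy as np

def rmse(err):
    """Root-mean-square error of a 1-D error series (e.g., metres)."""
    err = np.asarray(err, dtype=float)
    return np.sqrt(np.mean(err ** 2))

def reduction_pct(rmse_outage, rmse_assisted):
    """Percentage RMSE reduction of the GNSS-assisted case vs. full outage."""
    return 100.0 * (rmse_outage - rmse_assisted) / rmse_outage

# Illustrative horizontal position errors (metres), not from the paper:
outage_horiz = [2.0, -1.5, 1.8, -2.2]
assisted_horiz = [1.0, -0.7, 0.9, -1.1]
print(reduction_pct(rmse(outage_horiz), rmse(assisted_horiz)))
```

The same two functions apply unchanged to the vertical channel and to the attitude-angle errors (roll, pitch, yaw).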

Keywords: INS/LiDAR SLAM integration; UAV; integrated navigation system; optimized LOAM SLAM.
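The integration described in the abstract (and shown as a block diagram in Figure 9) is loosely coupled (LC): the LiDAR SLAM position fix corrects the INS-predicted state. A rough, generic illustration of one such correction step as a linear Kalman measurement update follows; the state model, matrices, and noise values are illustrative assumptions, not the paper's filter design:

```python
import numpy as np

def lc_update(x_pred, P_pred, z_lidar, R):
    """One loosely-coupled position update with H = I
    (the LiDAR SLAM solution measures position directly)."""
    H = np.eye(3)
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x_pred + K @ (z_lidar - H @ x_pred)
    P = (np.eye(3) - K @ H) @ P_pred
    return x, P

x_pred = np.array([10.0, 5.0, 2.0])   # INS-predicted ENU position (m)
P_pred = np.eye(3) * 4.0              # prior covariance (illustrative)
z = np.array([10.5, 4.8, 2.1])        # LiDAR SLAM position fix (m)
R = np.eye(3) * 1.0                   # measurement noise (illustrative)
x, P = lc_update(x_pred, P_pred, z, R)
print(x)  # corrected position, pulled toward the LiDAR fix
```

With these covariances the gain is 0.8 on each axis, so the corrected position lies 80% of the way from the INS prediction toward the LiDAR fix; a full navigation filter would also carry velocity and attitude error states.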


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1
UAS used in the study and its payload: (1) Sony α7R II camera; (2) Velodyne Puck LiDAR sensor; (3) Applanix APX-15 GNSS/IMU board.
Figure 2
UAS flight lines above the study area.
Figure 3
GNSS data gaps due to an unexpected antenna malfunction.
Figure 4
Ground control point (GCP) and checkpoint distribution in the study area.
Figure 5
Camera poses estimated using Pix4D Mapper software.
Figure 6
LiDAR SLAM trajectory generated using the optimized LOAM SLAM algorithm [41].
Figure 7
LiDAR point cloud generated using the optimized LOAM SLAM algorithm [41].
Figure 8
Graphical illustration of the data collection platform extrinsic transformation.
Figure 9
Block diagram of the GNSS/INS/LiDAR SLAM LC integration.
Figure 10
Complete GNSS signal outage: position errors (ENU).
Figure 11
Complete GNSS signal outage: errors of attitude angles (roll, pitch, and yaw).
Figure 12
Complete GNSS signal outage: comparison of trajectories.
Figure 13
GNSS assisted: position errors (ENU).
Figure 14
GNSS assisted: errors of attitude angles (roll, pitch, and yaw).
Figure 15
GNSS assisted: comparison of trajectories.

References

    1. Perez-Grau F.J., Ragel R., Caballero F., Viguria A., Ollero A. An architecture for robust UAV navigation in GPS-denied areas. J. Field Robot. 2018;35:121–145. doi: 10.1002/rob.21757. - DOI
    2. Balamurugan G., Valarmathi J., Naidu V. Survey on UAV navigation in GPS denied environments; Proceedings of the 2016 International Conference on Signal Processing, Communication, Power and Embedded System (SCOPES); Paralakhemundi, India. 3–5 October 2016.
    3. Martínez-Díaz M., Soriguera F. Autonomous vehicles: Theoretical and practical challenges. Transp. Res. Procedia. 2018;33:275–282. doi: 10.1016/j.trpro.2018.10.103. - DOI
    4. Samadzadegan F., Abdi G. Autonomous navigation of Unmanned Aerial Vehicles based on multi-sensor data fusion; Proceedings of the 20th Iranian Conference on Electrical Engineering (ICEE2012); Tehran, Iran. 15–17 May 2012; pp. 868–873.
    5. Kim J., Sukkarieh S. Autonomous airborne navigation in unknown terrain environments. IEEE Trans. Aerosp. Electron. Syst. 2004;40:1031–1045. doi: 10.1109/TAES.2004.1337472. - DOI