MUN-FRL: A Visual-Inertial-LiDAR Dataset for Aerial Autonomous Navigation and Mapping

Ravindu G Thalagala et al. Int J Rob Res. 2024 Oct;43(12):1853-1866. doi: 10.1177/02783649241238358. Epub 2024 Apr 16.

Abstract

This paper presents a unique outdoor aerial visual-inertial-LiDAR dataset captured using a multi-sensor payload to promote research on navigation in global navigation satellite system (GNSS)-denied environments. The dataset features flight distances ranging from 300 m to 5 km, collected using a DJI-M600 hexacopter drone and the National Research Council (NRC) Bell412 Advanced Systems Research Aircraft (ASRA). It consists of hardware-synchronized monocular images, inertial measurement unit (IMU) measurements, 3D light detection and ranging (LiDAR) point clouds, and high-precision real-time kinematic (RTK)-GNSS-based ground truth. Nine data sequences were collected as robot operating system (ROS) bags, comprising over 100 minutes of outdoor footage spanning urban areas, highways, airports, hillsides, prairies, and waterfronts. The dataset was collected to facilitate the development of visual-inertial-LiDAR odometry and mapping algorithms, visual-inertial navigation algorithms, and object detection, segmentation, and landing-zone detection algorithms based on real-world drone and full-scale helicopter data. All data sequences contain raw sensor measurements, hardware timestamps, and spatio-temporally aligned ground truth. The intrinsic and extrinsic calibrations of the sensors are provided, along with the raw calibration datasets. A performance summary of state-of-the-art methods applied to the data sequences is also included.
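Because each sequence is distributed as a ROS bag, a minimal sketch of iterating over one sequence with the ROS1 Python API is shown below. The bag file name and topic names are assumptions for illustration only; the actual names should be verified by running rosbag info on the downloaded files.

    # Minimal sketch: iterate over one MUN-FRL sequence stored as a ROS bag.
    # The bag file name and topic names below are ASSUMED for illustration;
    # run `rosbag info <file>` to list the actual topics in each sequence.
    import rosbag

    bag = rosbag.Bag("Bell412-1.bag")  # hypothetical file name
    topics = ["/camera/image_raw", "/imu/data", "/lidar/points"]  # assumed topic names

    counts = {t: 0 for t in topics}
    for topic, msg, t in bag.read_messages(topics=topics):
        # msg carries the raw sensor measurement; t is the bag receive time.
        # The hardware timestamp is carried in msg.header.stamp.
        counts[topic] += 1
    bag.close()
    print(counts)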

Keywords: Dataset; aerial autonomy; drones; full-scale aircraft; visual-inertial-LiDAR odometry and mapping.


Conflict of interest statement

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Figures

Figure 1. MUN sensor payload flight test configurations: NRC Bell412 ASRA nose mount (top and bottom left); MUN DJI-M600 airframe mount (top right).
Figure 2. Sensor payload, key coordinate frames, and dimensions.
Figure 3. Schematic view of sensor coordinate frames (red) and corresponding transformations (black). A minimal example of applying such an extrinsic transform is given after the figure list.
Figure 4. Camera-LiDAR calibration methodology (left: initial calibration using the MATLAB toolbox; right: fine-tuning the calibration using dataset images).
Figure 5. VI-LOAM results on the Bell412-6 sequence. Useful LiDAR points are available only at the beginning and end of the sequence.
Figure 6. VI-LOAM results on the Bell412-1 sequence. The terrain is predominantly flat, as this is a low-altitude flight over the main taxiway from the hangar.
Figure 7. PPS time synchronization scheme. An illustrative timestamp-correction sketch is given after the figure list.
Figure 8. Trajectory estimation results of Bell412 sequences: Bell412-[1 to 3] (top row), Bell412-[4 and 5] (bottom row). The red line shows the ground-truth trajectory. The presented results are obtained with real-time time-synchronized data and calibrated sensors using VI-LOAM.
Figure 9. Trajectory estimation results of DJI-M600 sequences: Quarry-1, Quarry-2, and Lighthouse. The red line shows the ground-truth trajectory. The presented results are obtained with real-time time-synchronized data and calibrated sensors using VI-LOAM.
Figure 10. Sample snapshots of scenes captured in each dataset sequence.
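
The calibration files pair each sensor frame in Figure 3 with a rigid-body transformation. As a minimal sketch of applying one, mapping a LiDAR point into the camera frame is a single 4x4 homogeneous transform; the matrix below is a placeholder, since the actual values ship with the dataset's calibration files.

    # Minimal sketch: map a LiDAR point into the camera frame using a 4x4
    # homogeneous extrinsic T_cam_lidar. The matrix below is a PLACEHOLDER;
    # the real values come from the calibration files shipped with the dataset.
    import numpy as np

    T_cam_lidar = np.array([
        [0.0, -1.0,  0.0,  0.10],   # placeholder rotation and translation
        [0.0,  0.0, -1.0, -0.05],
        [1.0,  0.0,  0.0,  0.02],
        [0.0,  0.0,  0.0,  1.00],
    ])

    p_lidar = np.array([5.0, 1.0, -0.5, 1.0])  # homogeneous LiDAR point
    p_cam = T_cam_lidar @ p_lidar
    print(p_cam[:3])

The same pattern composes: chaining the published extrinsics, e.g. T_imu_cam @ T_cam_lidar, relates any pair of frames in the payload.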
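For the PPS scheme in Figure 7, the effect of synchronization can be illustrated (with synthetic numbers, not the dataset's actual driver logic) as a piecewise-linear mapping from a free-running local clock to GNSS time, anchored at the pulse-per-second edges.

    # Illustrative PPS-style timestamp correction with SYNTHETIC numbers;
    # this is not the dataset's actual driver logic, only the idea behind it.
    import numpy as np

    gnss_seconds = np.array([100.0, 101.0, 102.0])        # true GNSS time at PPS edges
    local_at_pps = np.array([12.0030, 13.0041, 14.0052])  # drifting local clock at the same edges

    # Map a raw local timestamp to GNSS time by interpolating between PPS anchors.
    raw_stamp = 13.5
    corrected = np.interp(raw_stamp, local_at_pps, gnss_seconds)
    print(corrected)  # ~101.495: the local clock's offset and drift are removed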
