MUN-FRL: A Visual-Inertial-LiDAR Dataset for Aerial Autonomous Navigation and Mapping
- PMID: 40206426
- PMCID: PMC11977837
- DOI: 10.1177/02783649241238358
Abstract
This paper presents a unique outdoor aerial visual-inertial-LiDAR dataset captured using a multi-sensor payload to promote research on navigation in global navigation satellite system (GNSS)-denied environments. The dataset features flight distances ranging from 300 m to 5 km, collected using a DJI-M600 hexacopter drone and the National Research Council (NRC) Bell412 Advanced Systems Research Aircraft (ASRA). The dataset consists of hardware-synchronized monocular images, inertial measurement unit (IMU) measurements, 3D light detection and ranging (LiDAR) point clouds, and high-precision real-time kinematic (RTK)-GNSS-based ground truth. Nine data sequences were collected as robot operating system (ROS) bags, comprising over 100 min of footage from outdoor environments including urban areas, highways, airports, hillsides, prairies, and waterfronts. The dataset was collected to facilitate the development of visual-inertial-LiDAR odometry and mapping algorithms, visual-inertial navigation algorithms, and object detection, segmentation, and landing-zone detection algorithms based on real-world drone and full-scale helicopter data. All data sequences contain raw sensor measurements, hardware timestamps, and spatio-temporally aligned ground truth. The intrinsic and extrinsic calibrations of the sensors are provided, along with the raw calibration datasets. A performance summary of state-of-the-art methods applied to the data sequences is also provided.
Keywords: Dataset; aerial autonomy; drones; full-scale aircraft; visual-inertial-LiDAR odometry and mapping.
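Because the RTK-GNSS ground truth and the camera/IMU/LiDAR streams run at different rates, temporally aligning them amounts to interpolating the ground-truth trajectory at each sensor timestamp. As a minimal sketch of that idea (not the authors' actual alignment pipeline; function and variable names are illustrative), linear interpolation of ground-truth positions at a query timestamp could look like:

```python
from bisect import bisect_left

def interpolate_ground_truth(gt_times, gt_positions, query_time):
    """Linearly interpolate a ground-truth position at a sensor timestamp.

    gt_times: sorted list of ground-truth timestamps (seconds).
    gt_positions: matching list of (x, y, z) positions in a local metric frame.
    query_time: sensor timestamp to align to; must lie within the ground-truth span.
    """
    if not (gt_times[0] <= query_time <= gt_times[-1]):
        raise ValueError("query_time outside ground-truth span")
    i = bisect_left(gt_times, query_time)
    if gt_times[i] == query_time:
        return gt_positions[i]
    # Blend the two bracketing ground-truth samples.
    t0, t1 = gt_times[i - 1], gt_times[i]
    a = (query_time - t0) / (t1 - t0)
    p0, p1 = gt_positions[i - 1], gt_positions[i]
    return tuple((1 - a) * c0 + a * c1 for c0, c1 in zip(p0, p1))

# Hypothetical 10 Hz ground truth queried at a 20 Hz camera timestamp:
times = [0.0, 0.1, 0.2]
positions = [(0.0, 0.0, 10.0), (1.0, 0.0, 10.0), (2.0, 0.0, 10.0)]
print(interpolate_ground_truth(times, positions, 0.05))  # (0.5, 0.0, 10.0)
```

A real alignment would also interpolate orientation (e.g. via quaternion slerp) and apply the calibrated extrinsics between the GNSS antenna and each sensor frame, both of which the dataset's provided calibrations support.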
© The Author(s) 2024.
Conflict of interest statement
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.