2025 Aug 1;25(15):4745.
doi: 10.3390/s25154745.

Adaptive Covariance Matrix for UAV-Based Visual-Inertial Navigation Systems Using Gaussian Formulas


Yangzi Cong et al. Sensors (Basel).

Abstract

In a variety of UAV applications, visual-inertial navigation systems (VINSs) play a crucial role in providing accurate positioning and navigation solutions. However, a traditional VINS struggles to adapt flexibly to varying environmental conditions because its covariance matrix is fixed. This limitation becomes especially acute during high-speed drone operations, where motion blur and fluctuating image clarity can significantly compromise navigation accuracy and system robustness. To address these issues, we propose an adaptive covariance matrix estimation method for UAV-based VINS using Gaussian formulas. Our approach enhances the accuracy and robustness of the navigation system by dynamically adjusting the covariance matrix according to image quality. Using the Laplacian operator, detailed assessments of image blur are performed, providing a precise perception of image quality. Based on these assessments, a novel mechanism dynamically adjusts the visual covariance matrix through a Gaussian model according to the clarity of images in the current environment. Extensive simulation experiments on the EuRoC and TUM VI datasets, together with field tests, validate our method and demonstrate significant improvements in drone navigation accuracy under motion blur. Our algorithm achieves significantly higher accuracy than the well-known VINS-Mono framework, outperforming it by 18.18% on average; in the outdoor field tests, the RMS error is reduced by 65.66% on the F1 dataset and 41.74% on F2.
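The abstract's two core mechanisms, Laplacian-based blur scoring and Gaussian-weighted inflation of the visual measurement noise, can be sketched as below. This is a minimal illustration, not the paper's implementation: the function names, the specific inflation rule, and the `sigma_base`/`sigma_max` parameters are assumptions for the sketch.

```python
import numpy as np

def laplacian_blur_score(img):
    """Variance of the Laplacian response: a common sharpness metric.

    Higher variance means more high-frequency detail (a sharper image);
    motion blur suppresses the response and lowers the score.
    """
    k = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], dtype=float)
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(3):          # valid-mode 3x3 convolution
        for j in range(3):
            out += k[i, j] * img[i:i + h - 2, j:j + w - 2]
    return float(out.var())

def box_blur(img):
    """3x3 box blur, used here only to simulate a degraded frame."""
    p = np.pad(img, 1, mode="edge")
    s = np.zeros_like(img)
    for i in range(3):
        for j in range(3):
            s += p[i:i + img.shape[0], j:j + img.shape[1]]
    return s / 9.0

def adaptive_visual_sigma(score, mu, std, sigma_base, sigma_max):
    """Hypothetical Gaussian weighting of the visual measurement noise.

    mu/std would come from a Gaussian fit to the distribution of image
    quality scores. Frames at or above the mean keep the nominal sigma;
    blurrier frames have their noise inflated toward sigma_max, so the
    estimator trusts them less.
    """
    if score >= mu:
        return sigma_base
    w = np.exp(-0.5 * ((score - mu) / std) ** 2)
    return sigma_max - (sigma_max - sigma_base) * w
```

In a VINS back end, the returned sigma would scale the reprojection-error covariance block for that frame, down-weighting blurred images in the optimization.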

Keywords: adaptive covariance matrix; drones/UAV; image quality assessment; motion blur; visual–inertial navigation system.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1
Adaptive visual–inertial navigation architecture based on image quality judgment, where red marks the main contributions.
Figure 2
Illustration of the sliding window monocular VIO with re-localization.
Figure 3
Dataset sampling equipment. (a) EuRoC. (b) TUM VI.
Figure 4
Dataset sampling environment. (a) EuRoC Room. (b) EuRoC V. (c) TUM VI Room.
Figure 5
Simulation of motion blur at 45°. (a) Original image 1. (b) Slow motion blur 1. (c) Fast motion blur 1. (d) Original image 2. (e) Slow motion blur 2. (f) Fast motion blur 2.
Figure 6
Image quality score statistics.
Figure 7
Gaussian fit to the frequency of image quality scores.
Figure 8
Comparison of absolute pose error, with each subfigure corresponding to a different sequence.
Figure 9
Comparison of accuracy indices, with each subfigure corresponding to a different sequence.
Figure 10
Dataset distributions and equipment specifications. (a) F1 dataset. (b) F2 dataset. (c) Hardware setup.
Figure 11
Trajectory comparisons.
Figure 12
Results on the F1 dataset (left: VINS-Mono; right: ours): (a) axial error, (b) global position error.
Figure 13
Results on the F2 dataset (left: VINS-Mono; right: ours): (a) axial error, (b) global position error.


