Sensors (Basel). 2018 Apr 4;18(4):1085. doi: 10.3390/s18041085.

Wide-Baseline Stereo-Based Obstacle Mapping for Unmanned Surface Vehicles

Xiaozheng Mou et al. Sensors (Basel).

Abstract

This paper proposes a wide-baseline stereo-based static obstacle mapping approach for unmanned surface vehicles (USVs). The proposed approach eliminates the complicated calibration work and the bulky rig of our previous binocular stereo system, and extends the ranging ability from 500 m to 1000 m with an even larger baseline obtained from the motion of the USV. By integrating a monocular camera with GPS and compass information, the proposed system reconstructs the world locations of detected static obstacles while the USV is traveling, and then builds an obstacle map. To achieve more accurate and robust performance, multiple pairs of frames are leveraged to synthesize the final reconstruction results in a weighting model. Experimental results on our own dataset demonstrate the high efficiency of our system. To the best of our knowledge, we are the first to address the task of wide-baseline stereo-based obstacle mapping in a maritime environment.
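The core idea in the abstract is that two USV poses, taken some distance apart along the vehicle's track, form a virtual wide-baseline stereo pair: GPS gives the two camera positions, the compass gives the heading, and the obstacle's image column gives a bearing from each position. A minimal 2-D sketch of that triangulation (an illustration of the idea, not the paper's implementation; all function and variable names are hypothetical):

```python
import numpy as np

def triangulate_2d(p1, bearing1, p2, bearing2):
    """Intersect two bearing rays cast from USV positions p1 and p2.

    p1, p2   : (east, north) positions in metres, e.g. from GPS in a
               local tangent frame.
    bearing* : bearing to the obstacle in radians, measured clockwise
               from North (compass heading + camera bearing offset).
    Returns the estimated obstacle position (east, north).
    """
    # Unit direction vectors: bearing 0 points North, pi/2 points East.
    d1 = np.array([np.sin(bearing1), np.cos(bearing1)])
    d2 = np.array([np.sin(bearing2), np.cos(bearing2)])
    # Solve p1 + t1*d1 = p2 + t2*d2 for the ray parameters t1, t2.
    A = np.column_stack([d1, -d2])
    t = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t[0] * d1
```

For example, an obstacle seen at a 45° bearing from (0, 0) and at a −45° bearing from (100, 0) triangulates to (50, 50); the 100 m separation between the two USV positions is exactly the motion-induced baseline the abstract refers to.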

Keywords: obstacle mapping; unmanned surface vehicle; visual odometry; wide-baseline stereo.

Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1. Proposed camera location on the boat.
Figure 2. Illustration of motion parallax.
Figure 3. Roll correction using the IMU and the horizon line.
Figure 4. Sea surface plane estimation from the horizon line.
Figure 5. Roll and pitch angles of the camera obtained from the IMU and the horizon line.
Figure 6. ORB feature detection, tracking, and matching. The straight lines in the image connect the matched features.
Figure 7. Distribution of mapped feature points (red) in Seq-1. (The arrow points to the direction of North.) The left column shows the case of Feature Point #1; the right column shows the case of Feature Point #2. The top row shows the features on images with their distances displayed in meters. The middle row shows the reconstructed points using one pair of frames. The bottom row shows the reconstructed points using five pairs of frames.
Figure 8. Distribution of mapped feature points (red) in Seq-2. (The blue curve represents the trajectory of our USV; the arrow points to the direction of North.) The left column shows the case of Feature Point #1; the right column shows the case of Feature Point #2. The top row shows the features on images with their distances displayed in meters. The middle row shows the reconstructed points using one pair of frames. The bottom row shows the reconstructed points using ten pairs of frames.
Figure 9. Obstacle mapping result of Seq-1. The red points are the reconstructed feature points from obstacles; the blue curve represents the trajectory of our USV; the arrow points to the direction of North. (a) Triangulation using one frame pair; (b) triangulation using ten frame pairs.
Figure 10. Obstacle mapping result of Seq-2. The red points are the reconstructed feature points from obstacles; the blue curve represents the trajectory of our USV; the arrow points to the direction of North. (a) Triangulation using one frame pair; (b) triangulation using ten frame pairs.
Figure 11. The resulting obstacle map (middle) after a full moving loop of the USV (Seq-3). The corresponding obstacles in the original images are shown in the surrounding figures with an arrow linking each of them to the map. In the obstacle map, the blue circle represents the trajectory of the USV, and the red points represent the mapped feature points from the obstacles. The green rectangles are manually drawn to illustrate the stationary obstacles, while the yellow rectangles are manually drawn to show the distant obstacles with large mapping variances and the moving obstacles.
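Figures 7-10 contrast reconstructions from a single frame pair with reconstructions synthesized from five or ten pairs via the weighting model mentioned in the abstract. The paper's exact weights are not reproduced here, but one plausible form of such a model weights each pair's estimate by its baseline length, since longer baselines give lower triangulation variance (a hedged sketch; `fuse_estimates` is a hypothetical name):

```python
import numpy as np

def fuse_estimates(points, baselines):
    """Fuse per-frame-pair obstacle estimates into one location.

    points    : list of (east, north) estimates, one per frame pair.
    baselines : list of baseline lengths (metres) for those pairs;
                a longer baseline is assumed to yield a more accurate
                estimate and therefore receives a larger weight.
    Returns the weighted-average (east, north) position.
    """
    pts = np.asarray(points, dtype=float)
    w = np.asarray(baselines, dtype=float)
    w = w / w.sum()                      # normalize weights to sum to 1
    return (w[:, None] * pts).sum(axis=0)
```

With estimates (0, 0) and (10, 10) from baselines of 1 m and 3 m, the fused position is (7.5, 7.5): the longer-baseline estimate dominates, which mirrors how the ten-pair maps in Figures 8-10 tighten the scattered single-pair reconstructions.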

References

    1. Wang H., Mou W., Mou X., Yuan S., Ulun S., Yang S., Shin B. An Automatic Self-calibration Approach for Wide Baseline Stereo Cameras Using Sea Surface Images. Unmanned Syst. 2015;3:277–290. doi: 10.1142/S230138501540004X.
    2. Wang H., Mou X., Mou W., Yuan S., Ulun S., Yang S., Shin B. Vision Based Long Range Object Detection and Tracking for Unmanned Surface Vehicle; Proceedings of the 2015 IEEE 7th International Conference on Cybernetics and Intelligent Systems (CIS) and IEEE Conference on Robotics, Automation and Mechatronics (RAM); Siem Reap, Cambodia. 15–17 July 2015; pp. 101–105.
    3. Mou X., Wang H., Lim K. Scale-Adaptive Multiple-Obstacle Tracking with Occlusion Handling in Maritime Scenes; Proceedings of the 2016 12th IEEE International Conference on Control and Automation (ICCA); Kathmandu, Nepal. 1–3 June 2016; pp. 588–592.
    4. Shin B., Mou X., Mou W., Wang H. Vision-Based Navigation of An Unmanned Surface Vehicle with Object Detection and Tracking Abilities. Mach. Vis. Appl. 2018;29:95–112. doi: 10.1007/s00138-017-0878-7.
    5. Mur-Artal R., Montiel J.M.M., Tardos J.D. ORB-SLAM: A Versatile and Accurate Monocular SLAM System. IEEE Trans. Robot. 2015;31:1147–1163. doi: 10.1109/TRO.2015.2463671.