A Hardware-Friendly Optical Flow-Based Time-to-Collision Estimation Algorithm

Cong Shi et al. Sensors (Basel). 2019 Feb 16;19(4):807. doi: 10.3390/s19040807.

Abstract

This work proposes a hardware-friendly, dense optical flow-based Time-to-Collision (TTC) estimation algorithm intended for deployment on smart video sensors for collision avoidance. The algorithm first extracts biological visual motion features (motion energies), then uses a Random Forests regressor to predict robust, dense optical flow. Finally, TTC is reliably estimated from the divergence of the optical flow field. Because the algorithm involves only feed-forward data flows with simple pixel-level operations, it is inherently parallel and well suited to hardware acceleration. It also scales well, allowing flexible tradeoffs among estimation accuracy, processing speed, and hardware resources. Experimental evaluation shows that the Random Forests regressor improves optical flow accuracy over existing voting-based approaches, and that the TTC values estimated by the algorithm closely follow the ground truth. The specifics of a hardware design implementing the algorithm on a real-time embedded system are laid out.
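The final step above rests on a standard geometric fact: for a purely looming (radially expanding) flow field, the divergence of the flow equals 2/TTC. The sketch below illustrates this relation on a synthetic radial field; the field construction and the NumPy finite-difference divergence are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def ttc_from_flow(U, V):
    """Estimate time-to-collision from the divergence of a dense flow field.

    For a looming (radially expanding) field, div(flow) = 2 / TTC,
    so TTC = 2 / mean divergence."""
    dU_dx = np.gradient(U, axis=1)  # horizontal derivative of horizontal flow
    dV_dy = np.gradient(V, axis=0)  # vertical derivative of vertical flow
    return 2.0 / np.mean(dU_dx + dV_dy)

# Synthetic looming field: flow points away from the image center,
# scaled by 1/tau, where tau is the true time-to-collision in frames.
tau = 40.0
h, w = 240, 320
y, x = np.mgrid[0:h, 0:w].astype(float)
U = (x - w / 2) / tau
V = (y - h / 2) / tau
print(ttc_from_flow(U, V))  # recovers tau = 40.0
```

Because the synthetic field is linear in the pixel coordinates, the central differences are exact and the true TTC is recovered; on real estimated flow, the divergence would be noisy and typically averaged or regularized.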

Keywords: biological visual features; motion energy; motion estimation; optical flow; spatiotemporal energy; time-to-collision.


Conflict of interest statement

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Figures

Figure 1
The proposed hardware-friendly TTC estimation algorithm based on optical flow. (a) Algorithm flow overview. (b) Spatiotemporal filters for motion energy extraction. (c) The structure of a trained regression tree in the random forests.
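Panel (b) refers to spatiotemporal filters for motion energy extraction. The excerpt does not specify the paper's filter bank, but the classic motion-energy construction (Adelson and Bergen, 1985) squares and sums the responses of a quadrature pair of oriented space-time filters, giving a phase-insensitive, direction-selective energy. A minimal 1D (one space axis, one time axis) sketch under that assumption:

```python
import numpy as np

def motion_energy(stim, fx, ft, sigma, X, T):
    """Phase-insensitive motion energy: squared responses of a quadrature
    pair of oriented spatiotemporal (Gabor-like) filters, summed."""
    env = np.exp(-(X**2 + T**2) / (2 * sigma**2))
    phase = 2 * np.pi * (fx * X + ft * T)
    even, odd = env * np.cos(phase), env * np.sin(phase)
    return np.sum(even * stim) ** 2 + np.sum(odd * stim) ** 2

x = np.linspace(-3, 3, 61)
X, T = np.meshgrid(x, x)  # one space axis, one time axis

# Drifting gratings moving in opposite directions
pref = np.cos(2 * np.pi * 0.5 * (X + T))  # matches the filter's orientation
anti = np.cos(2 * np.pi * 0.5 * (X - T))  # opposite drift direction

e_pref = motion_energy(pref, 0.5, 0.5, 1.0, X, T)
e_anti = motion_energy(anti, 0.5, 0.5, 1.0, X, T)
# e_pref >> e_anti: the energy is direction-selective
```

A bank of such filters at different orientations in space-time (i.e., different preferred velocities) yields the motion-energy features that a regressor can map to optical flow.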
Figure 2
Mean Absolute Errors (MAE) of optical flow computation for 1D horizontal translation at different speeds from −8 to 8 pixel/frame with a step of 0.5 pixel/frame.
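The excerpt does not define how the MAE in Figures 2 and 3 is computed; a common convention averages the per-component absolute flow error over all pixels. A minimal sketch under that assumption:

```python
import numpy as np

def flow_mae(u_est, v_est, u_gt, v_gt):
    """Mean Absolute Error between estimated and ground-truth optical flow,
    averaged over both flow components and all pixels."""
    err = np.abs(u_est - u_gt) + np.abs(v_est - v_gt)
    return 0.5 * np.mean(err)

# Toy check: a uniform 0.25 px/frame bias in u only gives MAE = 0.125
u_gt = np.zeros((4, 4))
v_gt = np.zeros((4, 4))
print(flow_mae(u_gt + 0.25, v_gt, u_gt, v_gt))  # 0.125
```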
Figure 3
Mean Absolute Errors (MAE) of optical flow computation for more complicated motion patterns. In the looming pattern, each image frame is generated by expanding its preceding frame at a constant expansion rate, and then cropping the frame to leave only the central 320 × 240 region within the frame.
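A side note on the looming pattern described above: under a pinhole model, image scale is proportional to 1/depth, so expanding each frame by a constant factor s means the apparent depth shrinks geometrically, Z[k+1] = Z[k]/s, and the per-frame TTC stays constant at s/(s-1) frames. The sketch below verifies this; the value of s is hypothetical, as the paper's expansion rate is not given in this excerpt.

```python
# Constant per-frame expansion factor s => depth shrinks geometrically:
# Z[k+1] = Z[k] / s, so TTC per frame is constant at s / (s - 1).
s = 1.02   # hypothetical expansion rate (not from the paper)
Z = 100.0  # arbitrary initial depth; units cancel in the ratio
ttcs = []
for _ in range(5):
    closing = Z - Z / s      # depth closed during one frame interval
    ttcs.append(Z / closing)  # TTC measured in frames
    Z /= s
# every entry equals s / (s - 1) = 51.0 frames
```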
Figure 4
TTC estimation results for synthetic looming sequences. (a) Three Middlebury images selected to generate synthetic looming sequences. (b) One of the synthetic looming sequences: the Army looming sequence. (c) Global averages of estimated TTCs of the three sequences.
Figure 5
TTC estimation results for a real-world looming sequence. (a) Image sequence captured from a moving camera approaching a stationary object. (b) Global average of the estimated TTC for the obstacle in the sequence.
Figure 6
The TTC map sequence for a real-world driving video clip, overlaid on the image frames. The video was captured with a dashboard camera on a car that was initially moving at about 20 km/h and then decelerated to a complete halt. Colors trending toward red indicate shorter TTC values, whereas those trending toward blue indicate longer ones; TTC values longer than 5 s are shown as blue. (a) Far from the observing car. (b) Slightly closer to the observing car at a somewhat reduced velocity. (c) Legend bar for the TTC heat map values.
Figure 7
The conceptual diagram of pixel-stream pipeline hardware architecture for our TTC estimation algorithm.
