Review. 2016 Mar 5;16(3):335. doi: 10.3390/s16030335.

Robot Guidance Using Machine Vision Techniques in Industrial Environments: A Comparative Review


Luis Pérez et al. Sensors (Basel).

Abstract

In the factory of the future, most operations will be carried out by autonomous robots that need visual feedback to move around the workspace while avoiding obstacles, to work collaboratively with humans, to identify and locate the working parts, and to complement the information provided by other sensors in order to improve their positioning accuracy. Different vision techniques, such as photogrammetry, stereo vision, structured light, time-of-flight and laser triangulation, among others, are widely used for inspection and quality control processes in industry, and are now also applied to robot guidance. The choice of vision system depends strongly on the parts that need to be located or measured. This paper therefore presents a comparative review of machine vision techniques for robot guidance, analyzing sensor accuracy, range and weight, safety, processing time and environmental influences. Researchers and developers can use it as background for their future work.
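Two of the geometric ideas underlying the techniques surveyed (the pinhole camera model and stereo triangulation) can be sketched in a few lines. This is an illustrative example only, not code from the paper; the focal length, principal point, and baseline values below are hypothetical placeholders.

```python
# Illustrative sketch of two basics from the review, with made-up parameters:
# (1) pinhole projection of a 3D point to pixel coordinates, and
# (2) stereo depth recovery from disparity (Z = f * baseline / disparity).

def project(point_3d, f=800.0, cx=320.0, cy=240.0):
    """Project a 3D point (camera frame, Z > 0) to 2D pixel coordinates
    using the pinhole model: u = f*X/Z + cx, v = f*Y/Z + cy."""
    X, Y, Z = point_3d
    return (f * X / Z + cx, f * Y / Z + cy)

def depth_from_disparity(disparity_px, f=800.0, baseline_m=0.1):
    """Depth of a point seen by a rectified stereo pair: the shift
    (disparity) between homologous points encodes distance."""
    return f * baseline_m / disparity_px

u, v = project((0.5, 0.25, 2.0))   # -> (520.0, 340.0)
z = depth_from_disparity(40.0)     # -> 2.0 (meters)
```

The same projection equations are what a calibration step estimates in practice; here they are hard-coded purely to show the direct (3D to 2D) and inverse (2D to 3D) problems the paper refers to.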

Keywords: 3D sensors; machine vision; part localization; perception for manipulation; robot guidance; robot pose.


Figures

Figure 1. Pinhole camera model [23].
Figure 2. From 3D to 2D. (a) Direct problem; (b) Inverse problem.
Figure 3. From 2D to 3D. (a) Homologous points; (b) Intersection of the projection lines.
Figure 4. Physical marks used in marker-based stereo vision. (a) Stickers; (b) Laser points.
Figure 5. Detection of marks in several images.
Figure 6. Feature tracking algorithms.
Figure 7. Projecting a pattern on the object.
Figure 8. Structured light typical patterns.
Figure 9. Blue LED sensor components.
Figure 10. Projected pattern in Light Coding.
Figure 11. Laser triangulation.
Figure 12. Robot terms.
Figure 13. Laser tracker.
Figure 14. Multiple-sensor combination measuring system [73].
Figure 15. Bin picking [77].
Figure 16. Robot positioning using structured light [96].
Figure 17. Mobile robot using a blue light sensor for part localization. (a) Mobile robot; (b) Sensor operating.
Figure 18. Combining 3D models of robots with information from sensors [100].

References

    1. Deane P.M. The First Industrial Revolution. Cambridge University Press; Cambridge, UK: 1979.
    2. Kanji G.K. Total quality management: the second industrial revolution. Total Qual. Manag. Bus. Excell. 1990;1:3–12. doi: 10.1080/09544129000000001.
    3. Rifkin J. The third industrial revolution. Eng. Technol. 2008;3:26–27. doi: 10.1049/et:20080718.
    4. Kagermann H., Wahlster W., Helbig J. Recommendations for Implementing the Strategic Initiative Industrie 4.0: Final Report of the Industrie 4.0 Working Group. Forschungsunion; Berlin, Germany: 2013.
    5. Koeppe R. New industrial robotics: human and robot collaboration for the factory. Proceedings of the 2014 European Conference on Leading Enabling Technologies for Societal Challenges (LET’S 2014); Bologna, Italy, 29 September–1 October 2014.