Review
Sensors (Basel). 2020 Jul 2;20(13):3707. doi: 10.3390/s20133707.

Feature Sensing and Robotic Grasping of Objects with Uncertain Information: A Review

Chao Wang et al.

Abstract

As intelligent robots find more applications, their task objects are becoming more varied. However, handling unfamiliar objects remains a challenge for a robot. We review recent work on the feature sensing and robotic grasping of objects with uncertain information. In particular, we focus on how a robot perceives the features of an object so as to reduce its uncertainty, and on how the robot completes object grasping through learning-based approaches when traditional approaches fail. The uncertain information is classified into geometric information and physical information. Based on the type of uncertain information, objects are further classified into three categories: geometric-uncertain objects, physical-uncertain objects, and unknown objects. Approaches to the feature sensing and robotic grasping of these objects are then presented according to the characteristics of each type. Finally, we summarize the reviewed approaches and identify some open issues for future investigation. We find that object features such as material and compactness are difficult to sense, and that grasping approaches based on learning networks play a more important role as the degree of unknownness of the task object increases.

Keywords: feature sensing; geometric uncertainty; physical uncertainty; robotic grasping; uncertain objects.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1. Pipeline describing feature sensing and robotic grasping.
Figure 2. Classification of objects with uncertain information.
Figure 3. Shape identification with completed 2D boundary [33].
Figure 4. Scenarios of objects’ 6D pose estimation [40]: (a) tabletop scenario, (b) bin-picking scenario.
Figure 5. The whole pipeline of the random forest approach [46].
Figure 6. Different grasper structures: (a) sucker [48], (b) two-fingered grasper [49], (c) three-fingered grasper [50], (d) four-fingered grasper [51], (e) five-fingered grasper [51], (f) soft grasper [48].
Figure 7. Object grasping with a soft grasper [54].
Figure 8. The Scalable Tactile Glove (STAG) as a platform to learn from the human grasp [60].
Figure 9. Functional flowchart of object grasping by learning from demonstration (LfD).
Figure 10. Functional flowchart of object grasping based on the task.
Figure 11. Examples of task-oriented grasping approaches: (a) the schematic diagram for generating a task-related grasp database [86], (b) Task-Oriented Grasping Network [88].
Figure 12. Object searching in a cluttered environment [108].
Figure 13. Approaches for pose estimation: (a) the DenseFusion architecture [113], (b) recognition based on the multi-object pose estimation and detection (MOPED) framework [115].
Figure 14. A priori experience-based grasping approaches: (a) the hierarchical controller architecture [122], (b) the active learning architecture [125].
Figure 15. Framework of point cloud processing [133].
Figure 16. A grasping approach based on sensors and learning networks [134].

References

    1. Hu L., Miao Y., Wu G., Hassan M.M., Humar I. iRobot-Factory: An intelligent robot factory based on cognitive manufacturing and edge computing. Future Gener. Comput. Syst. 2019;90:569–577. doi: 10.1016/j.future.2018.08.006.
    2. Bera A., Randhavane T., Manocha D. The Emotionally Intelligent Robot: Improving Socially-aware Human Prediction in Crowded Environments; Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops; Long Beach, CA, USA. 16–20 June 2019.
    3. Wang T.M., Tao Y., Liu H. Current researches and future development trend of intelligent robot: A review. Int. J. Autom. Comput. 2018;15:525–546. doi: 10.1007/s11633-018-1115-1.
    4. Thanh V.N., Vinh D.P., Nghi N.T. Restaurant Serving Robot with Double Line Sensors Following Approach; Proceedings of the 2019 IEEE International Conference on Mechatronics and Automation; Tianjin, China. 4–7 August 2019; pp. 235–239.
    5. Yamazaki K., Ueda R., Nozawa S., Kojima M., Okada K., Matsumoto K., Ishikawa M., Shimoyama I., Inaba M. Home-assistant robot for an aging society. Proc. IEEE. 2012;100:2429–2441. doi: 10.1109/JPROC.2012.2200563.