Review
J Imaging. 2020 Jul 23;6(8):73.
doi: 10.3390/jimaging6080073.

Hand Gesture Recognition Based on Computer Vision: A Review of Techniques


Munir Oudah et al. J Imaging. 2020.

Abstract

Hand gestures are a form of nonverbal communication used in many fields, such as communication with deaf-mute people, robot control, human–computer interaction (HCI), home automation, and medical applications. Research on hand gestures has adopted many different techniques, based either on instrumented sensor technology or on computer vision. Hand signs can be classified in several ways: as postures or gestures, as static or dynamic, or as a hybrid of the two. This paper reviews the literature on hand gesture techniques and introduces their merits and limitations under different circumstances. In addition, it tabulates the performance of these methods, focusing on computer vision techniques, in terms of their similarities and differences, the hand segmentation technique used, classification algorithms and their drawbacks, the number and types of gestures, the dataset used, the detection range (distance), and the type of camera used. The paper is a thorough general overview of hand gesture methods with a brief discussion of some possible applications.

Keywords: computer vision; hand gesture; hand posture; human–computer interaction (HCI).


Conflict of interest statement

The authors of this manuscript have no conflicts of interest relevant to this work.

Figures

Figure 1. Different techniques for hand gestures: (a) glove-based attached sensors, either connected to the computer or portable; (b) computer-vision-based recognition using a camera, with either a marked glove or a bare hand.
Figure 2. Classification of the methods covered by this review.
Figure 3. Sensor-based data glove (adapted from https://physicsworld.com/a/smart-glove-translates-sign-language-into-digital-text/).
Figure 4. Using computer vision techniques to identify gestures: the user performs a gesture with one or both hands in front of a camera connected to a system framework, which applies feature-extraction and classification techniques so that the recognized gesture can control an application.
Figure 5. Color-based recognition using a glove marker [13].
Figure 6. Example of skin color detection: (a) thresholds are applied to the channels of the YUV color space to extract skin color, assigning 1 to skin pixels and 0 to non-skin pixels; (b) the hand is detected and tracked using the resulting binary image.
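The thresholding step described in the caption above can be sketched as follows. This is a minimal illustration, not the exact procedure from any of the reviewed papers: the chrominance bounds are assumed values, and real systems tune them to lighting conditions and skin tones.

```python
import numpy as np

def skin_mask(rgb):
    """Binary mask (1 = skin, 0 = non-skin) obtained by thresholding
    the U and V chrominance channels of the YUV color space."""
    rgb = rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Standard BT.601 RGB -> YUV conversion.
    u = -0.147 * r - 0.289 * g + 0.436 * b
    v = 0.615 * r - 0.515 * g - 0.100 * b
    # Assumed chrominance ranges roughly typical of skin tones.
    return ((u > -40) & (u < 10) & (v > 5) & (v < 60)).astype(np.uint8)

# A skin-toned pixel maps to 1; a saturated green pixel maps to 0.
frame = np.array([[[200, 140, 120], [0, 255, 0]]], dtype=np.uint8)
mask = skin_mask(frame)
```

The resulting binary image can then be fed to a connected-component or contour tracker to follow the hand, as panel (b) of the figure illustrates.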
Figure 7. Example of appearance-based recognition using foreground extraction to segment the region of interest (ROI); object features can be extracted with techniques such as pattern or image subtraction and foreground/background segmentation algorithms.
Figure 8. Example of motion-based recognition using frame-difference subtraction to extract hand features; the moving hand is separated from the fixed background.
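The frame-difference subtraction named in the caption above can be sketched in a few lines. This is an illustrative sketch under simple assumptions (grayscale frames, a static camera); the threshold value is an assumption, not taken from the reviewed papers.

```python
import numpy as np

def motion_mask(prev_frame, curr_frame, thresh=25):
    """Binary mask of moving pixels via absolute frame differencing.

    Pixels whose intensity changed by more than `thresh` between two
    consecutive frames are marked 1 (moving); all others are 0.
    """
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return (diff > thresh).astype(np.uint8)

# One pixel changes between frames, so only that pixel is flagged.
prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[1, 1] = 200
mask = motion_mask(prev, curr)
```

With a fixed background, the flagged region corresponds to the moving hand, which can then be passed on for feature extraction.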
Figure 9. Example of skeleton recognition using a depth and skeleton dataset to represent the hand skeleton model [62].
Figure 10. Depth-based recognition: (a) hand-joint distance from the camera; (b) different features extracted using the Kinect depth sensor.
Figure 11. 3D hand model interacting with a virtual system [83].
Figure 12. Simple example of a deep-learning convolutional neural network architecture.
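The two core building blocks of the CNN architecture sketched in the figure above, convolution and pooling, can be illustrated with plain NumPy. This is a didactic sketch of the operations only, not a trainable network and not the specific architecture of any reviewed paper.

```python
import numpy as np

def conv2d(x, k):
    """'Valid' 2-D convolution of a single-channel image x with kernel k:
    slide the kernel over the image and sum the elementwise products."""
    h, w = k.shape
    H, W = x.shape
    out = np.zeros((H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + h, j:j + w] * k)
    return out

def max_pool(x, s=2):
    """Non-overlapping s-by-s max pooling, which downsamples the feature
    map while keeping the strongest response in each window."""
    H, W = x.shape
    return x[:H - H % s, :W - W % s].reshape(H // s, s, W // s, s).max(axis=(1, 3))

# A 2x2 all-ones kernel over an all-ones 4x4 image sums 4 pixels per window.
feat = conv2d(np.ones((4, 4)), np.ones((2, 2)))   # 3x3 map of 4.0
pooled = max_pool(np.arange(16.0).reshape(4, 4))  # keeps the max of each 2x2 block
```

A full CNN stacks several such convolution layers (with learned kernels and a nonlinearity) and pooling layers, followed by fully connected layers that output a score per gesture class.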
Figure 13. Most common application areas of hand gesture interaction systems (image adapted from [12,14,42,76,83,98,99]).

References

    1. Zhigang F. Computer gesture input and its application in human computer interaction. Mini Micro Syst. 1999;6:418–421.
    2. Mitra S., Acharya T. Gesture recognition: A survey. IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 2007;37:311–324. doi: 10.1109/TSMCC.2007.893280.
    3. Ahuja M.K., Singh A. Static vision based hand gesture recognition using principal component analysis; Proceedings of the 2015 IEEE 3rd International Conference on MOOCs, Innovation and Technology in Education (MITE); Amritsar, India. 1–2 October 2015; pp. 402–406.
    4. Kramer R.K., Majidi C., Sahai R., Wood R.J. Soft curvature sensors for joint angle proprioception; Proceedings of the 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems; San Francisco, CA, USA. 25–30 September 2011; pp. 1919–1926.
    5. Jesperson E., Neuman M.R. A thin film strain gauge angular displacement sensor for measuring finger joint angles; Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society; New Orleans, LA, USA. 4–7 November 1988; pp. 807–vol.