Review

Sensors (Basel). 2025 Mar 8;25(6):1687. doi: 10.3390/s25061687.

Hand Gesture Recognition on Edge Devices: Sensor Technologies, Algorithms, and Processing Hardware

Elfi Fertl et al.

Abstract

Hand gesture recognition (HGR) is a convenient and natural form of human-computer interaction suitable for a wide range of applications. Much research has already focused on wearable-device-based HGR. By contrast, this paper gives an overview of device-free HGR, i.e., systems that do not require the user to wear anything such as a data glove or to hold a device. HGR systems are explored with respect to technology, hardware, and algorithms. We show how timing and power requirements are interconnected with the hardware, pre-processing algorithm, classification approach, and sensing technology, and how these choices permit more or less granularity, accuracy, and number of gestures. The sensor modalities evaluated are WiFi, vision, radar, mobile networks, and ultrasound. The pre-processing techniques explored are stereo vision, multiple-input multiple-output (MIMO) processing, spectrograms, phased arrays, range-Doppler maps, range-angle maps, Doppler-angle maps, and multilateration. Classification approaches with and without machine learning (ML) are studied; among those with ML, the assessed algorithms range from simple tree structures to transformers. All applications are evaluated with respect to their level of integration: whether the presented application is suitable for edge integration, its real-time capability, whether continuous learning is implemented, what robustness was achieved, whether ML is applied, and the accuracy level. Our survey aims to provide a thorough understanding of the current state of the art in device-free HGR, on edge devices and in general. Finally, on the basis of present-day challenges and opportunities in this field, we outline the further research we suggest for improving HGR. Our goal is to promote the development of efficient and accurate gesture recognition systems.

Keywords: 4G; 5G; AI accelerators; LTE; WiFi; algorithms; artificial intelligence; edge machine learning; hand gesture recognition; image processing; lidar; radar; signal processing; ultrasound; vision.

Conflict of interest statement

Authors Elfi Fertl and Georg Stettinger were employed by the company Infineon Technologies AG. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Figures

Figure 1. Applications of gesture recognition and similar systems.
Figure 2. Technology-agnostic processing flow.
Figure 3. Synergy of requirements, technology, and algorithms for HGR systems.
Figure 4. Actuation with a single frequency.
Figure 5. Actuation with a chirp.
Figure 6. Actuation for UWB.
Figure 7. Spectrogram (in radar often called micro-Doppler), calculated by a 1D Fourier transform in the fast-time direction over the entire sample, in a frequency band around the transmitted single frequency.
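
The spectrogram in Figure 7 is obtained by repeatedly applying a 1D Fourier transform to windowed frames of the echo and keeping only the band around the transmitted carrier, where Doppler shifts from hand motion show up. The following is a minimal numpy sketch under that assumption; the frame length, hop size, band width, and the synthetic test signal are illustrative choices, not values from the paper.

```python
import numpy as np

def micro_doppler_spectrogram(rx, fs, f_carrier, frame_len=4096, hop=1024, band_hz=2000.0):
    """Spectrogram (micro-Doppler) of a single-frequency (CW) echo.

    A 1D FFT is taken over consecutive windowed frames of the received
    signal; only the band around the transmitted carrier, where Doppler
    shifts caused by hand motion appear, is kept.
    """
    window = np.hanning(frame_len)
    n_frames = 1 + (len(rx) - frame_len) // hop
    freqs = np.fft.rfftfreq(frame_len, d=1.0 / fs)
    band = np.abs(freqs - f_carrier) <= band_hz       # frequency bins near the carrier
    spec = np.empty((int(band.sum()), n_frames))
    for i in range(n_frames):
        frame = rx[i * hop:i * hop + frame_len] * window
        spectrum = np.fft.rfft(frame)                  # 1D FFT of one frame
        spec[:, i] = 20 * np.log10(np.abs(spectrum[band]) + 1e-12)
    return freqs[band], spec                           # frequency axis (Hz), dB magnitude

# Synthetic example: a 24.5 kHz carrier frequency-modulated by a +/-300 Hz "hand motion".
fs = 192_000
t = np.arange(0, 0.5, 1.0 / fs)
f_inst = 24_500 + 300 * np.sin(2 * np.pi * 2 * t)      # instantaneous frequency in Hz
rx = np.cos(2 * np.pi * np.cumsum(f_inst) / fs) + 0.05 * np.random.randn(t.size)
f_axis, spec = micro_doppler_spectrogram(rx, fs, f_carrier=24_500)
print(spec.shape)                                      # (bins in band, time frames)
```
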
Figure 8. Four ultrasound transmitters at λ/2 spacing, 24.5 kHz frequency, 0° steering.
Figure 9. Four ultrasound transmitters at λ/2 spacing, 24.5 kHz frequency, 45° steering.
Figure 10. Eight ultrasound transmitters at λ/2 spacing, 24.5 kHz frequency, 0° steering.
Figure 11. Eight ultrasound transmitters at λ/2 spacing, 24.5 kHz frequency, 45° steering.
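
Figures 8-11 show beam patterns of uniform linear ultrasound arrays at λ/2 spacing steered to 0° and 45°. The sketch below is a rough illustration assuming idealized isotropic elements: it computes the array factor of a phase-steered uniform linear array for the four cases shown. Once the element spacing is expressed in wavelengths, the 24.5 kHz carrier cancels out; the function name and angle grid are assumptions for illustration.

```python
import numpy as np

def array_factor_db(n_elements, d_over_lambda, steer_deg, angles_deg):
    """Array factor (in dB) of a uniform linear array with phase steering.

    Each element is fed the carrier with a progressive phase shift so that
    the contributions add coherently in the desired steering direction.
    """
    theta = np.deg2rad(angles_deg)
    theta0 = np.deg2rad(steer_deg)
    k_d = 2 * np.pi * d_over_lambda                   # k*d = 2*pi*d/lambda
    n = np.arange(n_elements)[:, None]
    # Per-element phase: propagation term minus the applied steering phase.
    phases = k_d * n * (np.sin(theta)[None, :] - np.sin(theta0))
    af = np.abs(np.exp(1j * phases).sum(axis=0)) / n_elements
    return 20 * np.log10(af + 1e-12)

angles = np.linspace(-90, 90, 721)
for n_tx, steer in [(4, 0), (4, 45), (8, 0), (8, 45)]:   # the four cases of Figures 8-11
    af_db = array_factor_db(n_tx, 0.5, steer, angles)
    peak_angle = angles[np.argmax(af_db)]
    print(f"{n_tx} elements, steered to {steer:>2} deg: main lobe at {peak_angle:.1f} deg")
```
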
Figure 12. Raw ultrasound data sample in 3D.
Figure 13. Raw ultrasound data sample in 2D; only the segmented part is plotted.
Figure 14. Raw ultrasound data sample in 2D.
Figure 15. Range-Doppler/velocity map of the segmented gesture, calculated by a 2D Fourier transform (first in the fast-time and then in the slow-time direction).
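
The range-Doppler/velocity map in Figure 15 is computed by a 2D Fourier transform, first along fast time and then along slow time. The following minimal numpy sketch assumes the segmented gesture is already arranged as a slow-time × fast-time matrix; the windowing and static-clutter removal steps are common practice added here for illustration, not taken from the paper.

```python
import numpy as np

def range_doppler_map(frames, n_range=None, n_doppler=None):
    """Range-Doppler map of a segmented gesture.

    `frames` has shape (slow_time, fast_time): one row of echo/beat samples
    per transmitted pulse or chirp. A 1D FFT along fast time gives range
    bins; a second FFT along slow time (per range bin) gives Doppler bins.
    """
    slow, fast = frames.shape
    n_range = n_range or fast
    n_doppler = n_doppler or slow
    # Remove the static (zero-Doppler) background, e.g. direct path and walls.
    frames = frames - frames.mean(axis=0, keepdims=True)
    range_fft = np.fft.fft(frames * np.hanning(fast), n=n_range, axis=1)         # fast time -> range
    rd = np.fft.fft(range_fft * np.hanning(slow)[:, None], n=n_doppler, axis=0)  # slow time -> Doppler
    rd = np.fft.fftshift(rd, axes=0)                    # center zero Doppler
    return 20 * np.log10(np.abs(rd) + 1e-12)            # dB magnitude, shape (Doppler, range)

# Example on random data standing in for a segmented gesture recording.
rng = np.random.default_rng(0)
frames = rng.standard_normal((64, 256))                 # 64 pulses x 256 fast-time samples
rd_map = range_doppler_map(frames)
print(rd_map.shape)                                     # (64 Doppler bins, 256 range bins)
```
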
Figure 16. HGR classification algorithms and processes.

