Sensors (Basel). 2017 Apr 14;17(4):862.
doi: 10.3390/s17040862.

Fuzzy System-Based Target Selection for a NIR Camera-Based Gaze Tracker

Rizwan Ali Naqvi et al.
Abstract

Gaze-based interaction (GBI) techniques have been a popular subject of research in the last few decades. Among other applications, GBI can be used by persons with disabilities to perform everyday tasks, can serve as a game interface, and can play a pivotal role in the human-computer interface (HCI) field. While gaze tracking systems have shown high accuracy in GBI, detecting a user's gaze for target selection is a challenging problem that must be considered when using a gaze detection system. Past research has used eye blinking for this purpose, as well as dwell time-based methods, but these techniques are either inconvenient for the user or require a long time for target selection. Therefore, in this paper, we propose a fuzzy system-based target selection method for near-infrared (NIR) camera-based gaze trackers. The results of experiments, together with usability tests and on-screen keyboard tests, show that the proposed method outperforms previous methods.

Keywords: GBI; NIR camera-based gaze tracker; fuzzy system; gazing at a target to select it.
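For context, dwell time-based selection (the baseline the paper compares against) simply fires a selection event once the gaze has remained inside a target region for a fixed duration. A minimal sketch follows; the 1.5 s threshold and the rectangular region test are illustrative assumptions, not values from the paper:

```python
# Hypothetical dwell time-based target selection (the baseline method);
# the dwell threshold and region test are assumptions for illustration.
import time

DWELL_SECONDS = 1.5  # assumed dwell threshold

def dwell_select(gaze_stream, target_rect):
    """Return a selection once the gaze dwells on target_rect long enough.

    gaze_stream: iterable of (x, y) gaze positions in screen coordinates.
    target_rect: (left, top, right, bottom) of the selectable target.
    """
    left, top, right, bottom = target_rect
    enter_time = None
    for x, y in gaze_stream:
        if left <= x <= right and top <= y <= bottom:
            enter_time = enter_time or time.monotonic()
            if time.monotonic() - enter_time >= DWELL_SECONDS:
                return (x, y)          # target selected
        else:
            enter_time = None          # gaze left the target; reset timer
    return None
```

The inconvenience the abstract mentions is visible here: every selection costs at least DWELL_SECONDS, which the fuzzy approach aims to avoid.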


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1. Overall procedure of the proposed method.
Figure 2. Example of detecting the pupil and glint of the right eye. (a) Input face image; (b) Search area of the right eye; (c) Binarized image; (d) Approximate pupil area detected by the circular Hough transform (CHT), and eye ROI defined on the basis of the approximate pupil area; (e) Pupil and glint boundaries detected by the Chan–Vese algorithm with an adaptive mask.
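A rough Python/OpenCV sketch of this coarse-to-fine pipeline is shown below; the thresholds, kernel sizes, and the brightest-point glint shortcut are assumptions, and the paper's Chan–Vese boundary refinement (step e) is omitted:

```python
# Sketch of the coarse-to-fine pupil/glint localization of Figure 2;
# thresholds are assumptions and the Chan-Vese refinement is omitted.
import numpy as np
import cv2

def locate_pupil_and_glint(eye_gray):
    # (c) Binarize: under NIR illumination the pupil is the darkest region.
    _, binary = cv2.threshold(eye_gray, 50, 255, cv2.THRESH_BINARY_INV)
    binary = cv2.medianBlur(binary, 5)

    # (d) Approximate pupil center and radius via the circular Hough transform.
    circles = cv2.HoughCircles(binary, cv2.HOUGH_GRADIENT, dp=2,
                               minDist=eye_gray.shape[0],
                               param1=100, param2=20, minRadius=5, maxRadius=40)
    if circles is None:
        return None
    px, py, pr = np.round(circles[0, 0]).astype(int)

    # Define the eye ROI on the basis of the approximate pupil area, then
    # take the brightest specular reflection inside it as the glint.
    x0, y0 = max(px - 2 * pr, 0), max(py - 2 * pr, 0)
    roi = eye_gray[y0:py + 2 * pr, x0:px + 2 * pr]
    _, _, _, (gx, gy) = cv2.minMaxLoc(cv2.GaussianBlur(roi, (5, 5), 0))
    return (px, py, pr), (x0 + gx, y0 + gy)
```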
Figure 3. Flowchart for detecting the glint center and pupil region.
Figure 4. Three template graphs for calculating the template matching score.
Figure 5. Four images showing the detected centers of the glint and the pupil when a person is looking at the (a) upper-left; (b) upper-right; (c) lower-left; and (d) lower-right calibration positions on a monitor.
Figure 6. Relationship between the pupil movable region (the quadrangle defined by (Px0, Py0), (Px1, Py1), (Px2, Py2), and (Px3, Py3)) in the eye image and the monitor region (the quadrangle defined by (Mx0, My0), (Mx1, My1), (Mx2, My2), and (Mx3, My3)).
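One common way to realize such a four-point quadrangle-to-quadrangle mapping is a perspective (homography) transform; whether the paper uses exactly this form is not stated here, so the sketch below, with made-up corner coordinates, is only an assumption:

```python
# Hypothetical mapping from the pupil movable region to the monitor region
# via a perspective transform; corner coordinates are invented examples.
import numpy as np
import cv2

pupil_corners = np.float32([(120, 80), (180, 82),       # (Px0,Py0), (Px1,Py1)
                            (118, 130), (182, 128)])     # (Px2,Py2), (Px3,Py3)
monitor_corners = np.float32([(0, 0), (1920, 0),         # (Mx0,My0), (Mx1,My1)
                              (0, 1080), (1920, 1080)])  # (Mx2,My2), (Mx3,My3)
H = cv2.getPerspectiveTransform(pupil_corners, monitor_corners)

def gaze_on_monitor(pupil_center):
    # Map one pupil-center coordinate in the eye image to monitor pixels.
    p = np.float32([[pupil_center]])            # shape (1, 1, 2)
    return cv2.perspectiveTransform(p, H)[0, 0]

print(gaze_on_monitor((150, 105)))              # roughly mid-screen
```

The four calibration gazes of Figure 5 are what supply the pupil-region corners used to build this mapping.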
Figure 7. 16 Gabor filters with four scales and four orientations.
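A 4-scale by 4-orientation bank like the one pictured can be built with OpenCV's getGaborKernel; the wavelengths, kernel size, and other parameter values below are assumptions, not values from the paper:

```python
# Sketch of the 16-filter Gabor bank (four scales x four orientations)
# of Figure 7; kernel size and parameter values are assumptions.
import numpy as np
import cv2

def gabor_bank():
    kernels = []
    for lam in (4, 8, 16, 32):            # four scales (wavelengths, px)
        for k in range(4):                # four orientations
            theta = k * np.pi / 4.0       # 0, 45, 90, 135 degrees
            kernels.append(cv2.getGaborKernel((31, 31), sigma=lam / 2.0,
                                              theta=theta, lambd=lam,
                                              gamma=0.5, psi=0.0))
    return kernels                        # 16 kernels in total

# Responses over the gaze ROI of Figure 8 can then be computed with, e.g.,
# cv2.filter2D(roi, cv2.CV_32F, k) for each kernel k in gabor_bank().
```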
Figure 8. ROI for applying the Gabor filter based on the user's gaze position.
Figure 9. Fuzzy method for detecting the user's gaze for target selection.
Figure 10. Designing optimal input fuzzy membership functions for features 1–3 based on the maximum entropy criterion. (a) Membership functions for features 1 and 2; (b) Membership functions for features 1 and 3.
Figure 11. Definitions of the output membership functions.
Figure 12. Finding the output value of the input membership function for three features: (a) feature 1; (b) feature 2; and (c) feature 3.
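Reading an input value off a piecewise-linear membership function, as Figure 12 depicts, reduces to linear interpolation. The sketch below uses invented breakpoints for the "Low"/"High" sets and also shows how per-feature degrees would combine under the MIN and MAX rules referred to in Figures 16–21:

```python
# Hypothetical shoulder-shaped membership functions for the three features
# of Figure 12; the breakpoints (0.4, 0.6) are invented for illustration.
def low(x):
    # "Low" set: full membership at 0, falling linearly to 0 at 0.6.
    return max(0.0, min(1.0, (0.6 - x) / 0.6))

def high(x):
    # "High" set: 0 below 0.4, rising linearly to full membership at 1.0.
    return max(0.0, min(1.0, (x - 0.4) / 0.6))

features = [0.2, 0.7, 0.5]             # normalized feature 1, 2, 3 values
high_degrees = [high(f) for f in features]

# Combining the per-feature degrees under the MIN and MAX rules:
print("MIN rule:", min(high_degrees))  # 0.0 -> weakest feature dominates
print("MAX rule:", max(high_degrees))  # 0.5 -> strongest feature dominates
```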
Figure 13. Obtaining output score values of the fuzzy system using various defuzzification methods. (a) FOM, LOM, and MOM; (b) COG and BOA.
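The five defuzzification strategies named in Figure 13 can be illustrated on any sampled output membership curve. The curve below is invented, and the code is a generic sketch of FOM/MOM/LOM/COG/BOA, not the paper's implementation:

```python
# Generic sketch of the five defuzzification methods of Figure 13,
# applied to an invented, sampled output membership curve mu over [0, 1].
import numpy as np

x = np.linspace(0.0, 1.0, 1001)
mu = np.minimum(np.clip((x - 0.2) / 0.4, 0.0, 1.0), 0.8)  # example curve

peak = np.flatnonzero(mu == mu.max())
fom = x[peak[0]]                       # First Of Maxima
lom = x[peak[-1]]                      # Last Of Maxima
mom = x[peak].mean()                   # Middle (Mean) Of Maxima
cog = (x * mu).sum() / mu.sum()        # Center Of Gravity (discrete form)

# Bisector Of Area: the x that splits the area under mu into equal halves.
area = np.cumsum(mu)
boa = x[np.searchsorted(area, area[-1] / 2.0)]

print(f"FOM={fom:.2f} MOM={mom:.2f} LOM={lom:.2f} COG={cog:.2f} BOA={boa:.2f}")
```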
Figure 14. Experimental setup for the proposed method. (a) Example of the experimental environment; (b) Three screens of a teddy bear (upper), bird (middle), and butterfly (lower) used in our experiments.
Figure 15. Comparative examples of the detection of boundaries and centers of the pupil and glint by our method, the previous method, and the ground truth. (a) Detected boundaries of the pupil and glint in an eye image; (b) Comparison of detected pupil boundaries; (c) Comparison of detected pupil centers; (d) Comparison of detected glint boundaries; (e) Comparison of detected glint centers.
Figure 16. ROC curves of the classification results of TP and TN data according to different defuzzification methods with the MIN rule.
Figure 17. ROC curves of the classification results of TP and TN data according to different defuzzification methods with the MAX rule.
Figure 18. ROC curves of the classification results of TP and TN data according to different defuzzification methods with the MIN rule.
Figure 19. ROC curves of the classification results of TP and TN data according to different defuzzification methods with the MAX rule.
Figure 20. ROC curves of the classification results of TP and TN data according to different defuzzification methods with the MIN rule.
Figure 21. ROC curves of the classification results of TP and TN data according to different defuzzification methods with the MAX rule.
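The ROC curves of Figures 16–21 come from sweeping a threshold over the fuzzy output score and classifying TP (selection-intent) versus TN gazes. A generic sketch with synthetic score samples:

```python
# Generic ROC sweep over fuzzy output scores; the TP/TN score samples are
# synthetic stand-ins, not data from the paper.
import numpy as np

rng = np.random.default_rng(0)
tp_scores = rng.normal(0.7, 0.15, 500)   # scores when the user intends to select
tn_scores = rng.normal(0.4, 0.15, 500)   # scores during ordinary gazing

thresholds = np.linspace(0.0, 1.0, 101)
tpr = np.array([(tp_scores >= t).mean() for t in thresholds])
fpr = np.array([(tn_scores >= t).mean() for t in thresholds])

# Trapezoidal area under the (FPR, TPR) curve, sorted by FPR.
order = np.argsort(fpr)
f, t = fpr[order], tpr[order]
auc = float(np.sum(np.diff(f) * (t[1:] + t[:-1]) / 2.0))
print(f"AUC = {auc:.3f}")
```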
Figure 22. Comparison of the results of the proposed method with the results of "Method A" and the previous method, in experiments involving the (a) bear; (b) bird; and (c) butterfly.
Figure 23. Effect of noise on the results of the proposed method, compared with the results of "Method A" and the previous method, in experiments involving the (a) bear; (b) bird; and (c) butterfly.
Figure 24. Comparative results of subjective tests of 15 users using the dwell time-based method and the proposed method.
Figure 25. Example of an experiment in which a user types words on our system.
Figure 26. Comparative results of the proposed method and the dwell time-based method using the virtual keyboard, in terms of accuracy and execution time. (a) Accuracy; (b) Average execution time for typing one character; (c) Average execution time for typing one word.
Figure 27. Comparative results of the proposed method and the dwell time-based method using the virtual keyboard, in terms of convenience and interest.

