Gaze Tracking and Point Estimation Using Low-Cost Head-Mounted Devices

Ko-Feng Lee et al. Sensors (Basel). 2020 Mar 30;20(7):1917. doi: 10.3390/s20071917.

Abstract

In this study, a head-mounted device was developed to track eye gaze and estimate the gaze point on the user's visual plane. To provide a cost-effective vision-tracking solution, the head-mounted device combines an endoscope camera, an infrared light source, and a mobile phone; the mounting components are fabricated by 3D printing to further reduce costs. Based on the proposed image preprocessing techniques, the system can efficiently extract the pupil region from images captured by the camera module and estimate the pupil ellipse. A 3D eye model was also developed to locate eye-gaze points from the extracted eye images. In the experiments, the proposed system achieved average accuracy, precision, and recall rates of over 97%, demonstrating its efficiency. The system can be widely applied in the Internet of Things, virtual reality, assistive devices, and human-computer interaction applications.
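The abstract's preprocessing-and-extraction step can be illustrated with a minimal sketch: split pixel intensities into a dark (pupil) and a bright (background) cluster, in the spirit of the K-means binarization shown later in Figure 17, and take the centroid of the dark pixels as a pupil-center estimate. This is a simplified 1-D, two-cluster illustration under assumed parameters, not the authors' implementation; the synthetic test image is likewise an illustrative assumption.

```python
import numpy as np

def two_means_threshold(gray, iters=20):
    """1-D 2-means on pixel intensities: alternately split pixels into a
    dark and a bright cluster and recompute the cluster means, returning
    the midpoint between the two means as a binarization threshold."""
    c_dark, c_bright = float(gray.min()), float(gray.max())
    for _ in range(iters):
        t = (c_dark + c_bright) / 2.0
        dark, bright = gray[gray <= t], gray[gray > t]
        if dark.size == 0 or bright.size == 0:
            break
        c_dark, c_bright = dark.mean(), bright.mean()
    return (c_dark + c_bright) / 2.0

def pupil_centroid(gray):
    """Binarize with the dark cluster as foreground and return the
    centroid (row, col) of the dark pixels."""
    mask = gray <= two_means_threshold(gray)
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# Synthetic eye image: bright background with a dark elliptical "pupil"
# centered at row 60, column 90 (illustrative geometry only).
img = np.full((120, 160), 200, dtype=np.uint8)
yy, xx = np.mgrid[:120, :160]
img[((yy - 60) / 15.0) ** 2 + ((xx - 90) / 25.0) ** 2 <= 1.0] = 30
cy, cx = pupil_centroid(img)  # centroid lands near (60, 90)
```

A real pipeline, as the figure captions below indicate, would precede this with distortion correction, bilateral/mean-shift filtering, and morphological cleanup, and would fit an ellipse to the extracted contour rather than stopping at a centroid.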

Keywords: eye tracking; gaze estimation; head-mounted; mobile devices; wearable devices.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1. Invasive eye tracker [11].
Figure 2. The proposed camera modules.
Figure 3. Three-dimensional diagram of the camera module housing.
Figure 4. Camera module housing.
Figure 5. The head-mounted device.
Figure 6. Comfortable space for the eye in the head-mounted device.
Figure 7. Mobile phone in the head-mounted device.
Figure 8. System architecture diagram [10].
Figure 9. Image preprocessing flowchart.
Figure 10. Results after setting USB video device class (UVC) parameters.
Figure 11. Use of the checkerboard image to calibrate the camera.
Figure 12. Results of distortion correction.
Figure 13. Results of bilateral filtering.
Figure 14. Results after using mean shift filtering.
Figure 15. Results of color balancing.
Figure 16. Image after applying mathematical morphological processing.
Figure 17. Binarized image representation using K-means [24,25].
Figure 18. Result using the dark group as the threshold.
Figure 19. Result after erosion.
Figure 20. Result of filling the image with flood fill.
Figure 21. Result of subtracting the images before and after filling.
Figure 22. Superimposed mask image.
Figure 23. Image using the Canny edge detector.
Figure 24. Filling the lower part of the image.
Figure 25. Background automatically removed a second time.
Figure 26. Image of the connected pupil block.
Figure 27. Flowchart of capturing pupil images [10].
Figure 28. Region of interest (ROI) of the image.
Figure 29. Image binarization.
Figure 30. Image using the flood-filling method.
Figure 31. Anti-whitened low-threshold image.
Figure 32. Multithreshold pupil image.
Figure 33. Flowchart for fitting the pupil ellipse [10].
Figure 34. Pupil image using multiple thresholds.
Figure 35. Maximum-area contour.
Figure 36. Original image.
Figure 37. Reflective highlight contour.
Figure 38. Coincident contour vertex image.
Figure 39. Schematic of optimized pupil vertices.
Figure 40. Pupil elliptical vertex image.
Figure 41. Fitted pupil ellipse image.
Figure 42. Pupil ellipse detection results with the proposed preprocessing techniques.
Figure 43. Pupil ellipse reflection projection [9].
Figure 44. Eyeball model.
Figure 45. Center of the sphere intersects the gaze point vector [9].
Figure 46. Line-of-sight vector versus screen coordinates.
Figure 47. Flowchart of the system.
Figure 48. Integrated calibration diagram.
Figure 49. Gaze point sequence.
Figure 50. Error angle diagram.
Figure 51. Infrared image of the eye.
Figure 52. Eyelashes block the pupil.
Figure 53. Too many dark pixels in the eye socket.
Figure 54. Too few dark pupil pixels or an unstable ellipse shape.
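Figures 44–46 outline the geometric idea behind the gaze-point estimate: a line of sight from the eyeball-sphere center through the pupil center is intersected with the screen plane. A minimal sketch of that intersection, under an assumed coordinate frame (eyeball center near the origin, screen as the plane z = screen_z), could look like this; the frame, function names, and millimeter values are illustrative assumptions, since the real system obtains this mapping through calibration.

```python
import numpy as np

def gaze_point_on_screen(eye_center, pupil_center, screen_z):
    """Intersect the line of sight (eye_center -> pupil_center) with the
    screen plane z = screen_z and return the (x, y) hit point.

    Assumes a frame in which the screen is a plane of constant z in front
    of the eye; an illustrative stand-in for the calibrated mapping.
    """
    eye = np.asarray(eye_center, dtype=float)
    pupil = np.asarray(pupil_center, dtype=float)
    d = pupil - eye                      # gaze direction vector
    if abs(d[2]) < 1e-9:
        raise ValueError("gaze is parallel to the screen plane")
    t = (screen_z - eye[2]) / d[2]      # ray parameter at the plane z = screen_z
    hit = eye + t * d
    return hit[0], hit[1]

# Eye at the origin, pupil 12 mm in front and slightly up/right,
# screen 500 mm away: the lateral offset scales by 500/12 ≈ 41.7,
# so a 1.2 mm pupil offset maps to roughly 50 mm on the screen.
x, y = gaze_point_on_screen((0, 0, 0), (1.2, 0.6, 12.0), 500.0)
```

This also illustrates why small pupil-localization errors matter: the same ~41.7× scale factor amplifies any error in the fitted ellipse center, which is why the error-angle evaluation of Figure 50 is reported in angular terms.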

References

    1. Wang H., Liu Y.-H., Chen W. Visual tracking of robots in uncalibrated environments. Mechatronics. 2012;22:390–397. doi: 10.1016/j.mechatronics.2011.09.006. - DOI
    2. Andaluz V.H., Carelli R., Salinas L., Toibero J.M., Roberti F. Visual control with adaptive dynamical compensation for 3D target tracking by mobile manipulators. Mechatronics. 2012;22:491–502. doi: 10.1016/j.mechatronics.2011.09.013. - DOI
    3. Research-Eye Tracker and Brainwave Operation Sprout in the Market of the Frozen People, DIGITIME. 2017. [(accessed on 23 August 2017)]. Available online: http://www.digitimes.com.tw/tech/rpt/rpt_show.asp?cnlid=3&pro=y&proname=....
    4. What Role does Eye Tracking Technology Play in VR? TechNews. 2017. [(accessed on 23 August 2017)]. Available online: https://technews.tw/2016/01/14/eye-head-coordination-for-visual-cognitiv...
    5. Lin J.J.H., Lin S.S.J. Integrating eye trackers with handwriting tablets to discover difficulties of solving geometry problems. Br. J. Educ. Technol. 2016;49:17–29. doi: 10.1111/bjet.12517. - DOI
