J Eye Mov Res. 2018 Jan 25;10(5). doi: 10.16910/jemr.10.5.6.

Uncertainty visualization of gaze estimation to support operator-controlled calibration


Almoctar Hassoumi et al. J Eye Mov Res.

Abstract

In this paper, we investigate how visualization assets can support the qualitative evaluation of gaze estimation uncertainty. Although eye tracking data are commonly available, little has been done to visually investigate the uncertainty of recorded gaze information. This paper aims to fill this gap with innovative uncertainty computation and visualization. Given a gaze processing pipeline, we estimate the location of the gaze position in the world camera image. To do so, we developed our own gaze data processing pipeline, which gives us access to every stage of the data transformation and thus to the uncertainty computation. To validate the pipeline, we designed an experiment with 12 participants and showed that the correction methods we propose reduced the Mean Angular Error by about 1.32 cm, aggregating all 12 participants' results. After correction of the estimated gaze, the Mean Angular Error is 0.25° (SD=0.15°). Finally, to support the qualitative assessment of these data, we provide a map that encodes the actual uncertainty from the user's point of view.
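The Mean Angular Error reported above is the average angle between estimated and true gaze directions. A minimal sketch of that metric, assuming gaze and target are given as 3D direction vectors (the function names here are illustrative, not from the paper):

```python
import math

def angular_error_deg(gaze_dir, target_dir):
    """Angle in degrees between two 3D gaze direction vectors."""
    dot = sum(g * t for g, t in zip(gaze_dir, target_dir))
    ng = math.sqrt(sum(g * g for g in gaze_dir))
    nt = math.sqrt(sum(t * t for t in target_dir))
    # Clamp to guard against floating-point values just outside [-1, 1].
    cos = max(-1.0, min(1.0, dot / (ng * nt)))
    return math.degrees(math.acos(cos))

def mean_angular_error(gaze_dirs, target_dirs):
    """Average angular error over a sequence of gaze/target pairs."""
    errors = [angular_error_deg(g, t) for g, t in zip(gaze_dirs, target_dirs)]
    return sum(errors) / len(errors)
```

Aggregating this per-sample error over all participants yields summary figures such as the 0.25° (SD=0.15°) reported in the abstract.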

Keywords: accuracy; accuracy improvement; eye movement; eye tracking; gaze estimation; head movement; smooth pursuit; uncertainty; usability.


Figures

Figure 1. Flowchart of the gaze estimation method.
Figure 2. The user interface of ELAN. The software supports multiple synchronized media sources and an arbitrary number of annotation tiers. Videos are blurred to protect participants.
Figure 3. The green point is the new estimated point. From among all marker points recorded during the calibration procedure, the closest one is selected.
Figure 4. The marker points (red) and the estimated marker points (blue) are plotted for the X and Y values separately. The yellow overlays encapsulate the values to be considered for the interpolation. The upper (resp. lower) green point is the Y (resp. X) value of the interpolated point, and the upper (resp. lower) filled red point is its actual position with corrected error.
Figure 5. Inverse Distance Weighting impact area.
Figure 6. Modified Inverse Distance Weighting impact area.
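The Inverse Distance Weighting correction in Figures 5 and 6 can be sketched as follows: error vectors measured at the calibration markers are blended with weights proportional to 1/distance^p, and restricting contributions to a local impact area corresponds to the modified variant. This is a minimal illustration, not the paper's implementation; the `radius` cutoff and parameter values are assumptions:

```python
import math

def idw_correct(point, calib_points, calib_errors, power=2, radius=0.2):
    """Correct an estimated gaze point with IDW over calibration errors.

    calib_points: (x, y) marker positions recorded during calibration.
    calib_errors: (dx, dy) error vectors measured at those markers.
    radius: hypothetical impact area; points outside it are ignored,
            mimicking the modified IDW of Figure 6.
    """
    num_x = num_y = den = 0.0
    for (cx, cy), (ex, ey) in zip(calib_points, calib_errors):
        d = math.hypot(point[0] - cx, point[1] - cy)
        if d == 0.0:
            # Exact hit on a calibration point: apply its error directly.
            return (point[0] - ex, point[1] - ey)
        if d > radius:
            continue  # outside the impact area
        w = 1.0 / d ** power
        num_x += w * ex
        num_y += w * ey
        den += w
    if den == 0.0:
        return point  # no calibration point in range: leave uncorrected
    return (point[0] - num_x / den, point[1] - num_y / den)
```

Bounding the impact area keeps a distant, poorly estimated marker from distorting corrections on the other side of the field of view, which is consistent with the improvement shown in Figure 7.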
Figure 7. The gaze estimation's Mean Angular Error is reduced by the Inverse Distance Weighting and significantly reduced by the Modified Inverse Distance Weighting.
Figure 8. A. Pupil center position. B. Pupil location uncertainty area from 0 (red area) to 1 (blue area).
Figure 9. A. Common 9-point calibration method. B. Pursuit calibration with rectangular trajectory. C. Fixed marker and head movement calibration.
Figure 10. Our uncertainty visualization results: the accumulation (density map) of the recorded pupil location uncertainty. We used a bump-mapping technique to emphasize the strong variations (gradient detection). The image shows strong inaccuracy on the left part and a lack of records in the center.

Figure 11. Comparison of uncertainty visualization using a Gaussian kernel (A) and the pupil polygon with distance transform (B) when following the circular trajectory. The right image in B shows the result in a different color space for clarity.
Figure 12. Uncertainty visualization using a Gaussian kernel (A) and a pupil-shape-dependent kernel (B) after performing "+" pattern calibration. The Gaussian kernel in (A) is circular, whereas the kernel in (B) depends on the orientation, size, and shape of each pupil.
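The Gaussian-kernel density maps of Figures 10-12 can be sketched by splatting an isotropic kernel at each recorded pupil location; a shape-dependent kernel would instead scale and rotate the kernel per detected pupil ellipse. A minimal sketch, with grid size and `sigma` as assumed parameters:

```python
import math

def accumulate_density(grid_w, grid_h, samples, sigma=2.0):
    """Accumulate an uncertainty density map from pupil locations.

    Splats an isotropic Gaussian kernel (std dev `sigma`, in grid cells)
    at each (x, y) sample. Returns a grid_h x grid_w list of lists.
    """
    density = [[0.0] * grid_w for _ in range(grid_h)]
    for sx, sy in samples:
        for y in range(grid_h):
            for x in range(grid_w):
                d2 = (x - sx) ** 2 + (y - sy) ** 2
                density[y][x] += math.exp(-d2 / (2.0 * sigma ** 2))
    return density
```

Regions with many overlapping kernels accumulate high density, while gaps in the recordings (such as the empty center in Figure 10) remain near zero, which is what makes the map useful for qualitative assessment.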
Figure 13. Uncertainty visualization of gaze estimation after performing 9-point calibration with two different eye camera positions (frontal and bottom) and two different pupil sizes (large and small). The size of the pupil changes with the varying lighting conditions.
Figure 14. Left: pupil contours detected in the pupil camera. Right: the same pupil contours projected into the world camera. The outlined red contours show significant deformation due to the calibration transfer function. The bottom figures are the corresponding density maps.
Figure 15. Recorded pupil locations (in the pupil camera) on the left, corresponding target positions on the right. Similarly, EyeRecToo proposes an innovative way to collect such calibration points using a marker on a mobile phone.
Figure 16. Visualization of the norm of the error between the computed gaze location and its actual position. Lower errors are dark in the left image and green in the right image.
