Behav Res Methods. 2024 Jan;56(1):80-92. doi: 10.3758/s13428-022-01938-w. Epub 2022 Aug 10.

A low-cost robotic oculomotor simulator for assessing eye tracking accuracy in health and disease

Al Lotze et al. Behav Res Methods. 2024 Jan.

Abstract

Eye tracking accuracy is affected in individuals with vision and oculomotor deficits, impeding our ability to answer important scientific and clinical questions about these disorders. It is difficult to disambiguate decreases in eye movement accuracy from changes in the accuracy of the eye tracking itself. We propose the EyeRobot: a low-cost, robotic oculomotor simulator capable of emulating healthy and compromised eye movements to provide ground-truth assessment of eye tracker performance and of how different aspects of oculomotor deficits might affect tracking accuracy. The device can operate with eccentric optical axes or large deviations between the eyes, and can simulate oculomotor pathologies such as large fixational instabilities. We find that our design provides accurate eye movements for both central and eccentric viewing conditions, which can be tracked using a head-mounted eye tracker (Pupil Core). As proof of concept, we examine the effects of eccentric fixation on calibration accuracy and find that Pupil Core's existing eye tracking algorithm is robust to large fixation offsets. In addition, we demonstrate that the EyeRobot can simulate realistic eye movements, such as saccades and smooth pursuit, that can be tracked using video-based eye tracking. These tests suggest that the EyeRobot, an easy-to-build and flexible tool, can aid eye tracking validation and future algorithm development in healthy and compromised vision.

Keywords: Eye robot; Eye tracker; Ground truth eye movements.


Figures

Figure 1.
The custom-manufactured EyeRobot consists of two metal eye sockets supporting 3D-printed eyeballs, each attached to two motors (one for vertical and one for horizontal movement). The sockets are mounted to a wooden frame that also carries a scene camera mount above the eye assembly and two adjustable eye camera mounts facing the eyes.
Figure 2.
Schematic of the EyeRobot eye assembly. For clarity, only the left eye is shown; the right eye is a mirror image of the left. A. Front view of the EyeRobot eye assembly with parts and dimensions labeled. B. Side (medial) view of the eye assembly.
Figure 3.
The 3D-printed EyeRobot design. A & B: schematics of the device; C: device prototype. Servo motors are marked in green in the schematics. The eyeballs are attached to the frame using snap-fit pivots.
Figure 4.
EyeRobot accuracy and precision testing tasks. A. Incremental step task: the EyeRobot eye was moved from the origin (0,0) in incrementally increasing steps (of 2°) in the horizontal (purple) or vertical (brown) direction. B. Star task: the EyeRobot eyes were moved to one of 8 locations at 15° (large star, blue dots) or 3° (small star, yellow dots). Example trajectories are shown with dashed arrows. All movements were initiated from the (0,0) position.
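The target geometry of the two testing tasks is simple enough to reconstruct in a few lines. The sketch below is a hypothetical reconstruction from the caption, not the authors' code; the function names and defaults are illustrative only.

```python
import math

def star_targets(radius_deg, n_directions=8):
    """(x, y) targets evenly spaced on a circle of the given radius
    (degrees of visual angle), as in the Star task's 8 locations."""
    return [
        (radius_deg * math.cos(2 * math.pi * k / n_directions),
         radius_deg * math.sin(2 * math.pi * k / n_directions))
        for k in range(n_directions)
    ]

def step_targets(step_deg=2.0, max_deg=12.0, axis="horizontal"):
    """Incrementally increasing steps from the origin along one axis,
    as in the incremental step task (2°, 4°, ... steps)."""
    n = int(max_deg / step_deg)
    offsets = [step_deg * (k + 1) for k in range(n)]
    if axis == "horizontal":
        return [(d, 0.0) for d in offsets]
    return [(0.0, d) for d in offsets]
```

Either list can then be sent to the motor controller one target at a time, returning to (0,0) between movements as described in the caption.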
Figure 5.
Eye and scene camera placement on the EyeRobot. The scene camera is mounted above the eyes on a metal beam; the eye cameras are mounted on articulated arms in front of and below the eyes.
Figure 6.
Calibration marker and grid used for eye tracker calibration and validation. Bullseye targets used during the experiment are shown at target locations. Red dots represent gaze locations for the corresponding target for central (left), 5° eccentric (center), and 11°x12° eccentric (right) calibrations. Example target-gaze correspondences are shown with blue arrows.
Figure 7.
EyeRobot accuracy and precision for horizontal and vertical eye rotations of 2°, 4°, 6°, 8°, 10°, and 12°. The middle panel shows eye landing locations for each target location, for each motor (horizontal and vertical, designated by color and shape in the legend). Points in the middle panel have been offset by 0.2° in x and y for visibility. Side panels show mean eye position error at each location marked in the middle panel for each rotation direction/motor (filled bars: horizontal; open bars: vertical; error bars: SEM); left panel: left eye; right panel: right eye.
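The per-target summary statistics shown here (mean angular position error with SEM error bars) follow the standard definitions. A generic sketch, not the authors' analysis code; names are illustrative:

```python
import math

def position_errors(landings, target):
    """Euclidean angular error (deg) between each landing and the target."""
    tx, ty = target
    return [math.hypot(x - tx, y - ty) for x, y in landings]

def mean_and_sem(values):
    """Mean and standard error of the mean (sample SD / sqrt(n))."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return mean, math.sqrt(var / n)
```

Accuracy at a target is then the mean of `position_errors`, and precision can be summarized as the dispersion of the same landings.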
Figure 8.
EyeRobot performance on the Star task. A. EyeRobot combined eye landing positions in the scene camera for the 15° (red) and 3° (green) Star tasks. Each green and red symbol represents a single landing of the EyeRobot gaze on the reference location (corresponding grey dot). B. EyeRobot gaze position errors in the horizontal and vertical directions across target locations for the 15° task (error bars: SEM). The red dots on the x-axis represent the corresponding target locations where the gaze position error was calculated.
Figure 9.
Monocular eye tracker calibration with the EyeRobot eye at different ocular eccentricities. The accuracy and precision values listed are those reported by the Pupil Labs software after validation. Top row: eyeball positions for each calibration grid target fixation (laser position with or without offset). Green circle: the eye model fit; red dot: detected pupil center; red circle: pupil ellipse used to determine gaze direction (Swirski and Dodgson 2013). Bottom row: composite images of estimated gaze locations in the scene camera image. Yellow circle with turquoise center dot: the point of regard measured by the eye tracker. The laser light is marked on the calibration grid as a red dot (the raw versions of the calibration grid images are available in Supplementary Information, Figure C, for reference). A. No-offset calibration: the optical axis of the eyeball is aligned with the target. B. 5° rightward offset of the optical axis (red laser dot). C. Large offset (11° right horizontal and 12° up vertical). The laser is aligned with the cross at the top of the reversed-L-shaped paper target, with the marker placed on the bottom left edge to illustrate the offset. The visible variability in the orientation of the paper target at each position likely added slight variability to the bias and may thus have led to a less accurate measurement.
Figure 10.
Raw eye tracking model (A & B) and data for 3° saccades in 8 different directions (C) and a horizontal velocity ramp (D). A. Model of the tracked eyeballs in the corresponding eye camera's coordinates (red, blue, and green axes below the corresponding eyes) with the gaze direction vector (pupil normal) marked (left eye in blue, right eye in red). B. Projection of the pupil normal unit vector (orange) onto the scene camera coordinate frame (x: blue, y: green, z: red). The tracker software estimates the direction of the pupil normal unit vector (gaze direction), which is then separated into the x, y, and z components shown in C & D; the graphs' y-axes are unitless, showing the frame-by-frame projection of the pupil normal unit vector. C. Pupil normal vector projection time series for the right and left eyes of the EyeRobot in normalized eye camera coordinates. Saccade trajectories are marked in the legend at the top, and their order is marked above the traces. D. Vector time series for the right and left eyes of the EyeRobot performing constant-velocity ramps (smooth pursuit). Note the similarity of each cycle across the whole time series. Rows: x, y, and z axes. Note: motion is evident along all three axes due to the orientation of the eye cameras relative to the eye in 3D space.
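Expressing the pupil normal in the scene camera's frame, as described for panel B, is a standard change of basis: each component is the dot product of the gaze vector with one of the frame's orthonormal axes. A minimal sketch under that assumption (the axis vectors below are illustrative, not taken from the paper):

```python
def project_to_frame(v, x_axis, y_axis, z_axis):
    """Express vector v in a frame defined by three orthonormal axes.
    Each output component is the dot product of v with that axis."""
    dot = lambda a, b: sum(ai * bi for ai, bi in zip(a, b))
    return (dot(v, x_axis), dot(v, y_axis), dot(v, z_axis))
```

Because the eye cameras are tilted relative to the eyes in 3D, a purely horizontal eye rotation still changes all three projected components, which is why motion appears on every axis in panels C and D.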

