Review

Eye Movement Compensation and Spatial Updating in Visual Prosthetics: Mechanisms, Limitations and Future Directions

Nadia Paraskevoudi et al. Front Syst Neurosci. 2019 Feb 1;12:73. doi: 10.3389/fnsys.2018.00073. eCollection 2018.

Abstract

Despite appearing automatic and effortless, perceiving the visual world is a highly complex process that depends on intact visual and oculomotor function. Understanding the mechanisms underlying spatial updating (i.e., gaze contingency) remains an important yet unresolved issue in visual perception and cognitive neuroscience, and many questions about how visual information is updated as a function of eye movements are still open for research. Beyond its importance for basic research, gaze contingency also represents a challenge for visual prosthetics. While most artificial vision studies acknowledge its importance for providing accurate visual percepts to blind implanted patients, the majority of current devices do not compensate for gaze position. To date, artificial percepts have been delivered to blind users either by intraocular light-sensing circuitry or by external cameras. While the former inherently accounts for gaze shifts, the latter requires eye-tracking or similar technology in order to deliver percepts based on gaze position. Motivated by the need to overcome the hurdle of gaze contingency in artificial vision, we provide a thorough overview of the research addressing the neural underpinnings of eye movement compensation and its relevance to visual prosthetics. The present review outlines what is currently known about the mechanisms underlying spatial updating and surveys the attempts of current visual prosthetic devices to overcome the hurdle of gaze contingency. We discuss the limitations of current devices and highlight the need for eye-tracking methodology to introduce gaze-contingent information into visual prosthetics.
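
To make the camera-based case concrete, the following sketch shows one way an eye-tracker-reported gaze direction could be used to re-center the sampling window within a head-mounted camera frame, so that the delivered percept follows the eyes rather than the head. It is a minimal illustration in Python with hypothetical function and parameter names, not the processing pipeline of any particular device.

    import numpy as np

    def gaze_contingent_patch(frame, gaze_deg, deg_per_px, patch_px=64):
        """Sample the camera frame around the current gaze direction.

        frame      : 2D array, image from a head-mounted camera (grayscale).
        gaze_deg   : (x, y) gaze direction in degrees relative to straight ahead,
                     e.g., as reported by an eye tracker (hypothetical interface).
        deg_per_px : angular resolution of the camera in degrees per pixel.
        patch_px   : side length of the sampled window in pixels.
        """
        h, w = frame.shape
        # Shift the window center by the gaze offset (image y grows downward).
        cx = int(round(w / 2 + gaze_deg[0] / deg_per_px))
        cy = int(round(h / 2 - gaze_deg[1] / deg_per_px))
        half = patch_px // 2
        # Clamp so the window stays inside the frame.
        cx = min(max(cx, half), w - half)
        cy = min(max(cy, half), h - half)
        return frame[cy - half:cy + half, cx - half:cx + half]

    # Example: a 480x640 frame, gaze directed 10 degrees to the right.
    frame = np.random.rand(480, 640)
    patch = gaze_contingent_patch(frame, gaze_deg=(10.0, 0.0), deg_per_px=0.1)
    print(patch.shape)  # (64, 64)

Without the gaze offset, the window stays locked to the camera's optical axis and the percept follows head movements only, which is the behavior of most current camera-based devices.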

Keywords: artificial vision; blindness; eye movements; gaze contingency; neuroprosthetics; visual prosthetics.


Figures

FIGURE 1
The double-step saccade task used to illustrate spatial updating. Subjects fixate a centrally presented stimulus (FP) and are then asked to fixate two briefly and successively presented stimuli at different screen locations (T1, T2). The distance on the retina between the fovea and each target at the time of stimulus presentation is called the retinal error (“error” because the value must be brought to zero in order to achieve the goal of foveation). The eye movements required to correctly foveate each target are in turn called the motor errors. To make a saccade to the first target T1, the motor error ME1 can be derived directly from the retinal error RE1 for T1, so the first saccade can be executed correctly. However, the saccade to T1 displaces T2 from the retinal location at which it initially appeared. Thus, executing a second saccade based purely on the originally observed retinal error RE2 would fail to foveate T2 (orange dashed line). Instead, the motor plan ME2 for T2 must compensate for the intervening saccade to T1, which is accomplished by subtracting RE1 from RE2. Recall that both T1 and T2 are presented only briefly and are extinguished before any eye movement is executed. (Adapted from Mays and Sparks, 1980, and Klier and Angelaki, 2008.)
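
As a worked example of the updating rule described in the caption, the short Python sketch below (target coordinates are arbitrary values chosen for illustration) computes ME2 by subtracting RE1 from RE2 and checks that the compensated saccade lands on T2.

    import numpy as np

    # Positions in degrees of visual angle, relative to the fixation point FP.
    FP = np.zeros(2)
    T1 = np.array([10.0, 0.0])   # first target: 10 deg right of FP
    T2 = np.array([10.0, 8.0])   # second target: 10 deg right, 8 deg up

    RE1 = T1 - FP                # retinal error of T1 while fixating FP
    RE2 = T2 - FP                # retinal error of T2 while fixating FP
    ME1 = RE1                    # first motor error equals RE1

    # A second saccade driven by RE2 alone would start at T1 and land at
    # T1 + RE2 = (20, 8), missing T2 (the orange dashed line in the figure).
    # Compensating for the intervening saccade gives the correct motor plan:
    ME2 = RE2 - RE1              # = (0, 8): from T1, move 8 deg straight up
    assert np.allclose(T1 + ME2, T2)
    print(ME2)                   # [0. 8.]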
FIGURE 2
The double-step saccade task used by Sparks and Mays (1983). After fixating a stimulus presented at the center of the screen (FP), the monkey was trained to generate an eye movement (S1) to the location of the target (FP→TP). In some trials, however, after the target’s offset and before the eye movement was initiated, the animal received a train of electrical stimulation to the superior colliculus. This stimulation drove the animal’s eyes away from the fixation target (S2) to another position (FP→TP′). To generate an accurate saccade (S3) to the remembered target location (TP′→TP), the animal had to take into account the amplitude and direction of the stimulation-induced intervening eye movement. The study showed that the electrically induced saccade (S2) was followed by a saccade (S3) toward the location of the target stimulus (TP), allowing the animal to localize the target correctly. Saccades that ignored the electrically induced perturbation (dashed line) were not observed, demonstrating that the location of TP was encoded in spatial coordinates deduced from a combination of retinal activity and eye position. (Adapted from Sparks and Mays, 1983.)
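
The same vector bookkeeping can be written out for the stimulation experiment; the positions below are illustrative values, not data from Sparks and Mays (1983).

    import numpy as np

    # Positions in degrees relative to the fixation point FP.
    FP  = np.array([0.0, 0.0])
    TP  = np.array([12.0, 0.0])   # flashed target location
    TPp = np.array([-4.0, 6.0])   # eye position after the stimulation-induced saccade S2 (TP')

    planned_S1     = TP - FP      # saccade prepared before the stimulation
    induced_S2     = TPp - FP     # displacement imposed by collicular stimulation
    compensated_S3 = TP - TPp     # observed saccade: accounts for the displacement

    # An uncompensated saccade would replay the original plan from the new eye
    # position and land at TPp + planned_S1, missing TP (dashed line in the figure).
    assert np.allclose(compensated_S3, planned_S1 - induced_S2)
    print(compensated_S3)         # [16. -6.]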
FIGURE 3
The predictive-remapping task from the study of Duhamel et al. (1992). (A) The subject fixated a central point (FP), placing the receptive field (RF) of the cell under study on a blank part of the screen (RF before). A target point (TP), to which the animal was required to make a saccade, was then presented simultaneously with a peripheral visual stimulus (Stimulus) in the future, post-saccade, RF location of the neuron (dashed circle). (B) During initial fixation, there is no neural response (blue histogram). However, slightly before the initiation of the saccade (solid red line), the cell begins to fire (gray background). Saccades typically completed in 30 ms (dashed red line), placing the classical RF over the Stimulus. With normal latency, the response would then be expected to start 75 ms later (dashed green line), but the cell had already been responding in the meantime (gray hatched background). One would expect that, as the animal shifts its gaze to the location of the target, the RF shifts as well, and that the cell would begin to fire only after the normal response latency following saccade completion. However, portions of the cell’s discharge not only preceded that expected latency (gray hatched background) but also preceded the saccade itself (gray area), suggesting that the RF location shifted to anticipate the position after the eye movement. (Data extracted from Duhamel et al., 1992, especially Figure 2b.)
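
The latency argument in panel (B) reduces to simple arithmetic; the sketch below restates the caption’s 30 ms saccade duration and 75 ms visual latency, while the observed response onset is an illustrative value only.

    # Times in ms, relative to saccade onset.
    saccade_duration = 30                      # saccade completes; classical RF now covers the Stimulus
    visual_latency   = 75                      # normal visual response latency
    expected_onset   = saccade_duration + visual_latency   # earliest reafferent response: 105 ms

    observed_onset = -20                       # illustrative: firing begins before the saccade
    assert observed_onset < expected_onset
    # A response this early cannot be driven by the stimulus entering the classical
    # RF after the saccade; it implies the RF shifted predictively.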
FIGURE 4
The navigational tasks employed by Garcia et al. (2015), as seen from above. Patients implanted with Argus II were initially guided along a path by the experimenter (solid line). The path comprised an initial 2.5-meter leg, a left turn, and a final 2-meter leg. A lamp, acting as a visual landmark, was placed midway along the second leg. For the path reproduction task (Task 1), participants were led to the start position (filled circle) and asked to reproduce the path as accurately as possible (dashed line). For the triangle completion task (Task 2), participants started from the end of the reproduced path (open circle) and had to return directly to the initial start position (filled circle), thereby completing a walked triangular path (dotted line). (Diagram adapted from Garcia et al., 2015, Figure 2.)
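
For the triangle completion task, the direct return path can be computed from the two legs; the sketch below assumes a 90-degree left turn between the legs, which is an illustrative simplification rather than the exact geometry reported by Garcia et al. (2015).

    import numpy as np

    start      = np.array([0.0, 0.0])
    after_leg1 = start + np.array([0.0, 2.5])        # 2.5 m first leg, walking "north"
    after_leg2 = after_leg1 + np.array([-2.0, 0.0])  # 2 m second leg after a 90-deg left turn

    homing_vector   = start - after_leg2             # direct path back to the start (Task 2)
    homing_distance = np.linalg.norm(homing_vector)  # sqrt(2.5**2 + 2**2) ≈ 3.2 m
    print(homing_vector, round(homing_distance, 2))  # [ 2.  -2.5] 3.2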


