J Vis Exp. 2024 Apr 12;(206). doi: 10.3791/66779.

PyOKR: A Semi-Automated Method for Quantifying Optokinetic Reflex Tracking Ability

Affiliations

James K Kiraly et al. J Vis Exp.

Abstract

The study of behavioral responses to visual stimuli is a key component of understanding visual system function. One notable response is the optokinetic reflex (OKR), a highly conserved innate behavior necessary for image stabilization on the retina. The OKR provides a robust readout of image tracking ability and has been extensively studied to understand visual system circuitry and function in animals from different genetic backgrounds. The OKR consists of two phases: a slow tracking phase as the eye follows a stimulus to the edge of the visual plane and a compensatory fast phase saccade that resets the position of the eye in the orbit. Previous methods of tracking gain quantification, although reliable, are labor intensive and can be subjective or arbitrarily derived. To obtain more rapid and reproducible quantification of eye tracking ability, we have developed a novel semi-automated analysis program, PyOKR, that allows for quantification of two-dimensional eye tracking motion in response to any directional stimulus, in addition to being adaptable to any type of video-oculography equipment. This method provides automated filtering, selection of slow tracking phases, modeling of vertical and horizontal eye vectors, quantification of eye movement gains relative to stimulus speed, and organization of resultant data into a usable spreadsheet for statistical and graphical comparisons. This quantitative and streamlined analysis pipeline, readily accessible via PyPI import, provides a fast and direct measurement of OKR responses, thereby facilitating the study of visual behavioral responses.
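The core quantity throughout this workflow is tracking gain: slow-phase eye velocity divided by stimulus velocity. The sketch below is a minimal conceptual illustration of that calculation, not the PyOKR API; the function name `tracking_gain` and the polynomial-slope approach to estimating eye velocity are assumptions for illustration, consistent with the polynomial regression modeling described here.

```python
import numpy as np

def tracking_gain(t, eye_pos, stimulus_speed, degree=1):
    """Conceptual sketch (not the PyOKR API): fit eye position over a
    slow-phase epoch with a polynomial and return the ratio of mean
    fitted eye speed to stimulus speed."""
    coeffs = np.polyfit(t, eye_pos, degree)
    velocity = np.polyval(np.polyder(coeffs), t)  # fitted eye velocity
    return np.mean(np.abs(velocity)) / stimulus_speed

# Example: the eye drifts at ~4 deg/s while the stimulus moves at 5 deg/s
t = np.linspace(0, 1, 100)
eye = 4.0 * t
print(round(tracking_gain(t, eye, stimulus_speed=5.0), 2))  # 0.8
```

A gain of 1.0 would indicate perfect tracking; values below 1.0 indicate the eye lags the stimulus.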


Disclosures

The authors have no conflicts of interest.

Figures

Figure 1: Collection of OKR response data.
(A) OKR virtual arena apparatus for behavioral stimulation, as previously described. Four monitors surround a head-fixed animal (1), displaying a continuously moving checkerboard stimulus (2). The virtual drum can present unidirectional movement in all four cardinal directions as well as oscillatory sinusoidal stimuli. The mouse's left eye is illuminated by infrared (IR) light and recorded with a camera (3) to capture visual system responses reflected in eye tracking. (B) Eye tracking is analyzed by capturing the pupil and a corneal reflection generated by the IR light. Data collection and calculation of eye movements in response to the virtual drum were performed as described previously. (C) Schematic of eye vectors moving vertically (Y wave) and horizontally (X wave). (D) Sample traces of an eye's tracking response to unidirectional upward and backward motion, as well as vertical and horizontal sinusoidal motion.
Figure 2: Tracking analysis of unidirectional visual responses.
(A-D) Identification and selection of slow tracking phases for gain analysis. Sample unidirectional traces show visual responses to forward (A), backward (B), upward (C), and downward (D) motion relative to the mouse's eye. Slow phases are identified by adding the red and green points described in Step 3 to remove saccades; selected slow phases are highlighted in yellow. Polynomial regressions are overlaid on the traces as lines. (E) Quantification of the sample traces (A-D) as organized in the PyOKR readout. For each trace, total XY speeds and corresponding gains are calculated regardless of directionality. For unidirectional responses, these total speeds generally reflect the velocity in a single direction; for sinusoidal responses, this value reflects the average overall speed of the eye. Horizontal and vertical velocity components are broken out to show velocity along each axis. Gain is then calculated from the presented stimulus velocities. (F) Calculated tracking gains of wild-type animals (n = 13) in the four cardinal directions compared with their associated ETM quantification. Data are presented as mean ± SD and analyzed with a one-way ANOVA with multiple comparisons. *p < 0.05, **p < 0.01, ***p < 0.005, ****p < 0.0001.
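The decomposition of tracking into horizontal and vertical components described above can be sketched as follows. This is an illustrative assumption of the calculation, not PyOKR's implementation; the function name `xy_gains` and the use of a finite-difference velocity estimate are hypothetical choices.

```python
import numpy as np

def xy_gains(t, x, y, stim_vx, stim_vy):
    """Conceptual sketch: mean horizontal/vertical eye velocities and
    per-axis gains relative to the stimulus velocity components.
    An axis with zero stimulus velocity has no defined gain (None)."""
    vx = np.gradient(x, t).mean()  # mean horizontal eye velocity
    vy = np.gradient(y, t).mean()  # mean vertical eye velocity
    gain_x = vx / stim_vx if stim_vx else None
    gain_y = vy / stim_vy if stim_vy else None
    total_gain = np.hypot(vx, vy) / np.hypot(stim_vx, stim_vy)
    return gain_x, gain_y, total_gain

# Example: upward stimulus at 5 deg/s; the eye drifts mostly vertically
t = np.linspace(0, 2, 200)
gx, gy, g = xy_gains(t, 0.2 * t, 4.0 * t, stim_vx=0.0, stim_vy=5.0)
```

For a purely vertical stimulus, the vertical gain (here ~0.8) carries the signal of interest, while residual horizontal drift inflates only the total-speed measure.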
Figure 3: Automatic filtering of saccades facilitates OKR data processing and analysis.
(A-D) Automatic filtering of the traces from Figure 2A–D removes saccades and models only slow-phase motion by removing rapid velocity changes and stitching the slow phases together. The final slope represents the total eye movement over the given epoch. (E) Quantification of gains from the filtered sample data, as organized in the PyOKR readout. (F) Comparison of gain values between unfiltered and filtered sample eye traces reveals no significant differences. Data are presented as mean ± SD and analyzed with a Mann-Whitney U test between unfiltered and filtered results.
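The "remove rapid velocity changes and stitch slow phases together" step can be illustrated with a simple velocity-threshold filter. This is a hedged sketch of one plausible approach, not PyOKR's actual filter; the function name `stitch_slow_phases` and the specific threshold are assumptions.

```python
import numpy as np

def stitch_slow_phases(t, pos, speed_threshold):
    """Conceptual sketch: drop samples whose instantaneous speed exceeds
    the threshold (saccades) and stitch the remaining slow-phase
    displacements into one continuous trace."""
    v = np.gradient(pos, t)              # instantaneous eye velocity
    slow = np.abs(v) <= speed_threshold  # True for slow-phase samples
    dpos = np.diff(pos, prepend=pos[0])  # sample-to-sample displacement
    dpos[~slow] = 0.0                    # discard saccadic jumps
    return np.cumsum(dpos)               # stitched slow-phase position

# Example: 2 deg/s slow drift interrupted by a 10 deg resetting saccade
t = np.linspace(0, 1, 101)
pos = 2.0 * t
pos[50:] += 10.0
stitched = stitch_slow_phases(t, pos, speed_threshold=20.0)
```

After stitching, the end-to-end slope of `stitched` approximates the slow-phase drift alone (~2 deg over the epoch); samples immediately bordering a saccade are also dropped by the central-difference velocity estimate, which slightly underestimates the total.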
Figure 4: Derivation of tracking gains in response to oscillatory visual stimuli.
(A,B) Vertical (A) and horizontal (B) eye movement responses to sinusoidally moving stimuli can be modeled relative to defined oscillatory stimulus parameters. Selected regions are highlighted in yellow with the polynomial approximation overlaid on the trace. A model of the stimulus is shown as an orange sinusoid behind the trace for reference to the stimulus position at each point. (C) Gain calculations of wild-type sinusoidal responses (n = 7) reveal asymmetric responses between horizontal and vertical tracking ability. Data are presented as mean ± SD and analyzed with a one-way ANOVA with multiple comparisons. **p < 0.01, ***p < 0.005.
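One standard way to derive a gain from an oscillatory response, when the stimulus frequency is known, is to fit the eye trace to a sinusoid at that frequency and take the ratio of fitted to stimulus amplitude. The sketch below illustrates this with a linear least-squares fit; it is an assumed approach for illustration, not necessarily PyOKR's exact model, and the function name `sinusoidal_gain` is hypothetical.

```python
import numpy as np

def sinusoidal_gain(t, eye_pos, stim_amplitude, stim_freq_hz):
    """Conceptual sketch: fit A*sin(wt) + B*cos(wt) + offset by linear
    least squares and return fitted amplitude / stimulus amplitude."""
    w = 2 * np.pi * stim_freq_hz
    X = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
    a, b, _ = np.linalg.lstsq(X, eye_pos, rcond=None)[0]
    return np.hypot(a, b) / stim_amplitude  # amplitude ratio = gain

# Example: 10 deg stimulus at 0.2 Hz; the eye follows at 6 deg with a phase lag
t = np.linspace(0, 10, 1000)
eye = 6.0 * np.sin(2 * np.pi * 0.2 * t - 0.3)
print(round(sinusoidal_gain(t, eye, 10.0, 0.2), 2))  # 0.6
```

The sine/cosine pair absorbs any phase lag between eye and stimulus, so the gain reflects amplitude alone.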
Figure 5: Directional tracking can be modeled in its horizontal and vertical components.
(A) Vertical component of an eye tracking wave in response to an upward stimulus. (B) Horizontal component of an eye tracking wave in response to an upward stimulus. (C) Overall eye trajectory in both vertical and horizontal directions. (D) Three-dimensional model of the eye's movement vector over time in response to downward motion. Raw trace data are displayed in red, and the regression model of the trajectory is displayed in blue.
Figure 6: Analysis of the OKR in Tbx5f/f; Pcdh9-Cre mice shows significant deficits in unidirectional vertical tracking gains.
(A) Tbx5f/f; Pcdh9-Cre animals show no significant change in horizontal tracking gain. (B,C) Tbx5f/f; Pcdh9-Cre animals show a significant reduction of gain in their vertical responses, both upward (B) and downward (C). (D,E) Sinusoidal responses of Tbx5f/f; Pcdh9-Cre animals to horizontal (D) and vertical (E) oscillatory stimuli. (F) Quantification of Tbx5f/f; Pcdh9-Cre oscillatory responses shows significant increases in horizontal tracking gains but decreases in vertical responses. Data are presented as mean ± SD and analyzed with Mann-Whitney U tests. *p < 0.05, **p < 0.01, ****p < 0.0001.
Figure 7: Application of PyOKR to data acquired from alternative video-oculography methods.
(A) Apparatus for OKR virtual drum stimulation, as previously described. A 405 nm DLP projector is reflected via a convex mirror onto a hemisphere to create a virtual drum that surrounds the animal's field of view. Eye movements are measured with an NIR camera positioned outside the hemisphere. Unidirectional and sinusoidal bar gratings are presented to a head-fixed animal in vertical directions. (B,C) Upward (B) and downward (C) tracking phases are identified and selected for quantitative analysis. Slow phases are highlighted in yellow. (D) Tracking gains calculated from vertical tracking of wild-type animals (n = 5) using the methods described here. Asymmetric tracking ability is observed, with a significant decrease in downward tracking. (E) Oscillatory responses to sinusoidal stimuli modeled to quantify tracking gains in wild-type animals (n = 8). Slow phases are highlighted in yellow. (F) Quantification of sinusoidal gains reveals decreased downward tracking gains compared with upward gains. Data are presented as mean ± SD and analyzed with Mann-Whitney U tests. *p < 0.05.


References

    1. Stahl JS. Using eye movements to assess brain function in mice. Vision Res. 44 (28), 3401–3410 (2004).
    2. Kretschmer F, Tariq M, Chatila W, Wu B, Badea TC. Comparison of optomotor and optokinetic reflexes in mice. J Neurophysiol. 118, 300–316 (2017).
    3. Bronstein AM, Patel M, Arshad Q. A brief review of the clinical anatomy of the vestibular-ocular connections - how much do we know? Eye. 29 (2), 163–170 (2015).
    4. Simpson JI. The accessory optic system. Ann Rev Neurosci. 7, 13–41 (1984).
    5. Hamilton NR, Scasny AJ, Kolodkin AL. Development of the vertebrate retinal direction-selective circuit. Dev Biol. 477, 273–283 (2021).
