iScience. 2024 Dec 12;28(1):111584. doi: 10.1016/j.isci.2024.111584. eCollection 2025 Jan 17.

Microsaccade selectivity as discriminative feature for object decoding

Salar Nouri et al. iScience.

Abstract

Microsaccades, a form of fixational eye movements, help maintain visual stability during stationary observations. This study examines the modulation of microsaccadic rates by various stimulus categories in monkeys and humans during a passive viewing task. Stimulus sets were grouped into four primary categories: human, animal, natural, and man-made. Distinct post-stimulus microsaccade patterns were identified across these categories, enabling successful decoding of the stimulus category with accuracy and recall of up to 85%. We observed that microsaccade rates are independent of pupil size changes. Neural data showed that category classification in the inferior temporal (IT) cortex peaks earlier than changes in microsaccade rates, suggesting feedback from the IT cortex influences eye movements after stimulus discrimination. These results contribute to neurobiological models, enhance human-machine interfaces, optimize experimental visual stimuli, and deepen understanding of microsaccades' role in object decoding.

Keywords: Artificial intelligence; Computer science; Human-computer interaction.

Conflict of interest statement

The authors declare no competing interests.

Figures

Graphical abstract
Figure 1
Experimental paradigm and eye movements (A) A sample of the stimulus set: The stimulus set comprises grayscale images from four main categories: humans, animals, man-made objects, and natural scenes. Human and animal stimuli are further divided into faces and bodies and together form the animate category, while man-made and natural entities form the inanimate category. (B) Experimental task paradigm: In each trial, participants viewed a sequence of images while their eye movements were recorded. Using the RSVP technique, every trial displayed 155 randomly chosen stimuli across the categories. Each trial began with a blank screen, followed by a fixation point presented for 500 ms; each stimulus then appeared for 50 ms. Eye positions were tracked and recorded continuously throughout the trials. (C) Distribution of saccade rates and amplitudes over time: The aggregated distribution of saccade rates and amplitudes is shown across 15 ms time bins. The amplitude axis is logarithmic, with bins of 0.05 in log amplitude. Time zero marks stimulus onset. (D) Eye movement distribution for a trial in a monkey: Distribution of eye movements recorded from a monkey participant while viewing a specific stimulus. Color intensity indicates the frequency of eye positions during the experiment, with brighter colors marking more frequent positions. (E) Eye movement distribution for a trial in a human subject: Distribution of eye movements recorded from a human participant while viewing a specific stimulus. (F) A structured flowchart of the superordinate and mid-level stimulus categories used in the study.
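For concreteness, a minimal sketch of how the 15 ms rate histograms in panel (C) could be computed from per-trial saccade onset times is given below; the onsets_per_trial variable and the time range are illustrative assumptions, not the paper's code.

# Bin saccade onsets (ms, relative to stimulus onset) into 15 ms windows
# to estimate a saccade rate, as in Figure 1C.
# onsets_per_trial is a hypothetical list of per-trial onset-time arrays.
import numpy as np

def saccade_rate(onsets_per_trial, t_min=-100, t_max=800, bin_ms=15):
    edges = np.arange(t_min, t_max + bin_ms, bin_ms)
    counts = np.zeros(len(edges) - 1)
    for onsets in onsets_per_trial:
        counts += np.histogram(onsets, bins=edges)[0]
    # convert pooled counts to a rate in saccades per second
    rate_hz = counts / (len(onsets_per_trial) * bin_ms / 1000.0)
    centers = edges[:-1] + bin_ms / 2.0
    return centers, rate_hz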
Figure 2
Validity of the detected saccades (A) Saccade amplitude distribution in monkeys: The distribution of saccade amplitudes observed in monkeys across all categories shows a clear unimodal pattern, peaking at approximately 1.0 degree of amplitude. (B) Saccade rate over time in monkeys: The distribution of detected saccade rates in monkeys across all categories over time, with a bin width of 15 ms, shows a distinct increase in microsaccade rate in the 200–400 ms following stimulus presentation and is unimodal after stimulus onset. (C) Saccade rate over time in humans: The distribution of saccade rates across all categories in the human data, with a bin width of 15 ms, shows a similar increase in saccade rate around 200–400 ms after stimulus onset. (D) Saccade probability distribution by microsaccade duration: The distribution of saccade probability as a function of microsaccade duration across all categories is unimodal, peaking at approximately 14 ms; approximately 40% of the detected microsaccades have durations of 10–25 ms. (E) Saccade probability distribution by peak velocity: The distribution of saccade probability as a function of peak velocity across all categories is unimodal, peaking at approximately 30–40 °/s; 50% of detected microsaccades have peak velocities of 25–50 °/s. (F) Saccade amplitude by peak velocity: The distribution of saccade amplitude versus peak velocity across all categories shows a positive linear relationship.
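Panel (F) rests on the main-sequence relationship between saccade amplitude and peak velocity. A minimal sketch of checking that relationship with a least-squares line is shown below; the amplitudes and peak_velocities arrays are hypothetical, and the paper does not specify this fitting procedure.

# Check the roughly linear main-sequence relationship between saccade
# amplitude (deg) and peak velocity (deg/s), as in Figure 2F.
# amplitudes and peak_velocities are hypothetical 1-D arrays with one
# entry per detected microsaccade.
import numpy as np

def main_sequence_fit(amplitudes, peak_velocities):
    # least-squares line: v_peak = slope * amplitude + intercept
    slope, intercept = np.polyfit(amplitudes, peak_velocities, 1)
    r = np.corrcoef(amplitudes, peak_velocities)[0, 1]
    return slope, intercept, r  # r near 1 indicates a linear main sequence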
Figure 3
Distinctive saccade rate patterns among categories for the monkey dataset (A) Temporal distribution of saccade rates for different categories: The saccade rate distribution per stimulus category reveals an increase in rates at approximately 220 ms post-stimulus presentation, characterized by a unimodal distribution. (B) Microsaccade rate statistics across categories over the entire trial time (900 ms): The mean microsaccade rates, with error bars indicating standard deviation, exhibit distinct patterns across the animal, human, man-made, natural, animate, inanimate, face, and body stimulus categories, indicating that these categories are discernible from one another. Statistical significance was determined using a one-way ANOVA, with asterisks denoting significance levels: ∗p<0.05, ∗∗p<0.01, ∗∗∗p<0.001. (C) Category discrimination through the 2D-PCA representation of microsaccade rates: The two-dimensional PCA representation of microsaccade rates across stimulus categories, extracted from saccadic responses within a 100–600 ms post-stimulus window, shows distinct separability among the microsaccade rate distributions of different stimulus categories.
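As an illustration of the projection in panel (C), the sketch below reduces per-stimulus microsaccade-rate time courses (restricted to the 100–600 ms window) to two principal components; the rates array is hypothetical, and scikit-learn is used only as one possible implementation.

# Project microsaccade-rate time courses onto their first two principal
# components, as in Figure 3C.
# rates is a hypothetical (n_stimuli, n_time_bins) array covering the
# 100-600 ms post-stimulus window.
import numpy as np
from sklearn.decomposition import PCA

def pca_2d(rates):
    pca = PCA(n_components=2)          # PCA centers the data internally
    coords = pca.fit_transform(rates)  # (n_stimuli, 2) projection
    return coords, pca.explained_variance_ratio_

The resulting two-dimensional coordinates can then be plotted by category label to visualize separability.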
Figure 4
Time course of category classification performance from microsaccade rates using an SVM classifier on the monkey dataset The mean and standard deviation of classification accuracy are shown for all cases throughout each trial. The accuracy distributions are unimodal, with accuracy increasing after stimulus onset and peaking around 200–400 ms after stimulus presentation. The bar plots in each panel show the mean and standard deviation of the recall values within the 200–400 ms interval following stimulus onset. (A) Animate vs. Inanimate: Classification between animate and inanimate categories yields the highest accuracy, peaking at approximately 84%. (B) Face vs. Body: Classification between face and body categories peaks at approximately 75%. (C) Human vs. Animal: Classification between animal and human categories peaks at approximately 79%. (D) Man-made vs. Natural: Classification between man-made and natural categories peaks at approximately 76%. (E) All categories: Classification across all categories peaks at approximately 74%.
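A minimal sketch of this kind of sliding-window SVM decoding is given below, assuming per-trial saccade-rate time courses and category labels; the window length, kernel, and cross-validation scheme are placeholders rather than the paper's exact settings.

# Decode stimulus category from microsaccade rates in a sliding window
# with a linear SVM, as in Figure 4.
# rates is a hypothetical (n_trials, n_time_bins) array and labels a
# (n_trials,) array of category labels.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def decoding_time_course(rates, labels, window_bins=5, cv=5):
    accuracies = []
    for start in range(rates.shape[1] - window_bins + 1):
        X = rates[:, start:start + window_bins]
        scores = cross_val_score(SVC(kernel="linear"), X, labels, cv=cv)
        accuracies.append(scores.mean())
    return np.array(accuracies)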
Figure 5
Distinctive microsaccade rate patterns for the human dataset (A) Saccade rate temporal distribution of different categories for the human dataset: The distribution of saccade rates for each stimulus category in the human data reveals a rise in saccade rates approximately 200 ms after stimulus presentation, characterized by a unimodal distribution primarily driven by the stimulus presentation. Each bin represents a 15 ms interval. (B) Microsaccade rate statistics across categories over the entire trial time (900 ms): The mean microsaccade rates for each category, computed from all stimuli in the human data and shown with error bars indicating standard deviation, exhibit variability across categories. Statistical significance was determined using a one-way ANOVA, with asterisks denoting significance levels: ∗p<0.05, ∗∗p<0.01, ∗∗∗p<0.001.
Figure 6
Classification of the stimulus categories in the human dataset The mean and standard deviation of classification accuracy and recall are shown for all cases throughout each trial using a linear classifier. (A) Animate vs. Inanimate: Classification between animate and inanimate categories yields the highest accuracy, peaking at approximately 73%. (B) Face vs. Body: Classification between face and body categories peaks at approximately 71%. (C) Human vs. Animal: Classification between animal and human categories peaks at approximately 71%. (D) Man-made vs. Natural: Classification between man-made and natural categories peaks at approximately 72%. (E) All categories: Classification across all categories peaks at approximately 69%.
Figure 7
Pupil size comparison and classification accuracy over time with neural data and saccade rate (A) Pupil size comparison across stimulus categories for the monkey dataset. (B) Pupil size comparison across stimulus categories for the human dataset. (C) Classification accuracy over time with neural data and saccade rate for Animate vs. Inanimate stimulus categories. (D) Classification accuracy over time with neural data and saccade rate for Face vs. Body stimulus categories. (E) Classification accuracy over time with neural data and saccade rate for Animal vs. Human stimulus categories. (F) Classification accuracy over time with neural data and saccade rate for Natural vs. Man-made stimulus categories. Error bars indicate the standard deviation of the mean.
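The comparison in panels (C)-(F) hinges on when decoding accuracy peaks for neural data versus saccade rate. A minimal sketch of extracting and comparing peak-decoding latencies from two accuracy time courses is shown below; the input arrays are hypothetical.

# Compare peak-decoding latencies of two accuracy time courses
# (e.g., IT neural data vs. saccade rate), as in Figure 7C-7F.
# acc_neural and acc_saccade are hypothetical 1-D accuracy arrays sampled
# on the same time axis `times` (ms relative to stimulus onset).
import numpy as np

def peak_latency(times, accuracy):
    return times[int(np.argmax(accuracy))]

def latency_difference(times, acc_neural, acc_saccade):
    # positive value: saccade-rate decoding peaks later than neural decoding
    return peak_latency(times, acc_saccade) - peak_latency(times, acc_neural)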
