Sci Rep. 2023 Feb 9;13(1):2376. doi: 10.1038/s41598-023-29133-7.

Assessing the allocation of attention during visual search using digit-tracking, a calibration-free alternative to eye tracking

Yidong Yang et al. Sci Rep.

Abstract

Digit-tracking, a simple, calibration-free technique, has proven to be a good alternative to eye tracking in vision science. Participants view stimuli overlaid with a Gaussian blur on a touchscreen interface and slide a finger across the display to locally sharpen an area the size of the foveal region at the finger's position. Finger movements are recorded as an indicator of eye movements and attentional focus. Because of its simplicity and portability, this system has many potential applications in basic and applied research. Here we used digit-tracking to investigate visual search and replicated several known effects observed with different types of search arrays. Exploration patterns measured with digit-tracking during visual search of natural scenes were comparable to those previously reported for eye tracking and were constrained by similar saliency. Our results therefore provide further evidence for the validity and relevance of digit-tracking for basic and applied research on vision and attention.
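The display principle described above can be sketched in a few lines: blur the stimulus everywhere, then restore full sharpness inside a fovea-sized disc centered on the finger. The sketch below is illustrative only (not the authors' code); the `radius` and `sigma` values are arbitrary placeholders, not the calibrated parameters used in the study.

```python
import math

def blur(img, sigma=2.0):
    """Naive Gaussian blur of a 2-D list of grayscale values (0.0-1.0)."""
    r = int(3 * sigma)  # truncate the kernel at 3 standard deviations
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = weight = 0.0
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        g = math.exp(-(dx * dx + dy * dy) / (2 * sigma * sigma))
                        total += g * img[yy][xx]
                        weight += g
            out[y][x] = total / weight
    return out

def digit_tracking_view(img, finger, radius=5, sigma=2.0):
    """Blurred image with a sharp disc of given radius at the finger position."""
    fx, fy = finger
    blurred = blur(img, sigma)
    # Inside the disc, keep the sharp original ("central vision");
    # outside it, show the blurred version ("peripheral vision").
    return [[img[y][x] if (x - fx) ** 2 + (y - fy) ** 2 <= radius ** 2
             else blurred[y][x]
             for x in range(len(img[0]))]
            for y in range(len(img))]
```

In the actual paradigm the disc follows the finger frame by frame, so re-blurring on every touch event would be replaced by a precomputed blurred copy composited with the sharp original.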


Conflict of interest statement

The authors declare no competing interests.

Figures

Figure 1
An illustration of the task. Participants first saw a green button at the screen center. After touching the button, the image appeared, overlaid with a Gaussian blur. Touching the screen sharpened an area around the finger, simulating central vision. To illustrate the Gaussian blur clearly, the sharp area shown here is larger than the one participants saw during the task.
Figure 2
Examples of different search arrays before and after application of the blur filter. (a) Typical search arrays used in Experiment 1: color feature search (1a, a red square as the target), orientation feature search (1b, a horizontal bar as the target), and inefficient search with homogeneous (1c, a letter T as the target) or heterogeneous (1d, a letter T as the target) distractors. (b) Typical search arrays used in Experiment 2. The target was the boot, marked here by a red rectangle. The other objects were clothes in the semantically related condition and birds in the semantically unrelated condition.
Figure 3
Bar plot of the proportions of no-touch, confirmation, and exploration trials in the different experiments. Participants barely touched the screen when searching arbitrary arrays of items. In contrast, search in natural scenes generated more exploration.
Figure 4
Box plot of log-transformed reaction times in semantically related and unrelated arrays. Participants took longer to detect the target when it was semantically related to the distractors; ***p < 0.001.
Figure 5
Stimuli and results of Experiment 3a. (a) The same image overlaid with three different levels of blur. (b,c) Exploration distance and response time as a function of blur level.
Figure 6
Results of Experiment 3b. (a) Examples of digit-tracking attention maps at different blur levels and the eye-tracking attention map computed from the Ehinger et al. dataset. (b) Box plot of correlation coefficients between eye-tracking maps and digit-tracking maps; **p < 0.01, ***p < 0.001. (c,d) Prediction performance of different models on digit-tracking and eye-tracking data.
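The comparison between eye-tracking and digit-tracking attention maps reported in Figure 6 rests on per-image correlation of the two maps. A minimal sketch of that analysis step, assuming the maps are equally sized 2-D grids of fixation density (the function name and inputs are illustrative, not taken from the study's code):

```python
import math

def pearson(map_a, map_b):
    """Pearson correlation between two equally sized 2-D maps, flattened."""
    xs = [v for row in map_a for v in row]
    ys = [v for row in map_b for v in row]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

Collecting one coefficient per image and blur level would yield the distributions shown in the boxplot of panel (b).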
Figure 7
Comparison of human peripheral visual acuity with the visual acuity simulated by digit-tracking in the three experiments. Experiments 1 and 2 used the same blur level as Lio et al. Experiment 3a tested 30 blur levels, of which only the first 11 are shown here. Experiment 3b used 5 blur levels selected from Experiment 3a. Purple areas represent the screen eccentricity.

References

    1. Fischer B, Breitmeyer B. Mechanisms of visual attention revealed by saccadic eye movements. Neuropsychologia. 1987;25:73–83.
    2. Rayner K. The 35th Sir Frederick Bartlett Lecture: Eye movements and attention in reading, scene perception, and visual search. Q. J. Exp. Psychol. 2009;62:1457–1506.
    3. Yarbus AL. Eye movements during perception of complex objects. In: Yarbus AL, editor. Eye Movements and Vision. Springer US; 1967. pp. 171–211.
    4. Lio G, Fadda R, Doneddu G, Duhamel J, Sirigu A. Digit-tracking as a new tactile interface for visual perception analysis. Nat. Commun. 2019;10:5392.
    5. Chen X, Zelinsky GJ. Real-world visual search is dominated by top-down guidance. Vis. Res. 2006;46:4118–4133.