Using Gaze for Behavioural Biometrics

Alessandro D'Amelio et al. Sensors (Basel). 2023 Jan 22;23(3):1262. doi: 10.3390/s23031262.

Abstract

A principled approach to the analysis of eye movements for behavioural biometrics is laid down. The approach is grounded in foraging theory, which provides a sound basis for capturing the uniqueness of individual eye movement behaviour. We propose a composite Ornstein-Uhlenbeck process for quantifying the exploration/exploitation signature that characterises foraging eye behaviour. The relevant parameters of the composite model, inferred from eye-tracking data via Bayesian analysis, are shown to yield a suitable feature set for biometric identification; the latter is eventually accomplished via a classical classification technique. A proof of concept of the method is provided by measuring its identification performance on a publicly available dataset. Data and code for reproducing the analyses are made available. Overall, we argue that the approach offers a fresh view of both the analysis of eye-tracking data and prospective applications in this field.
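Intuitively, within a fixation the gaze drifts around a point of interest, which a mean-reverting Ornstein-Uhlenbeck (O-U) process, dX_t = θ(μ − X_t) dt + σ dW_t, captures naturally, while saccades relocate the process toward a new attractor. As a minimal illustration of these two regimes only (an Euler-Maruyama sketch with made-up parameter values, not the paper's composite model or its inferred parameters):

```python
import numpy as np

def simulate_ou(theta, mu, sigma, x0, dt, n_steps, rng):
    """Euler-Maruyama simulation of dX_t = theta*(mu - X_t) dt + sigma dW_t."""
    x = np.empty(n_steps)
    x[0] = x0
    for t in range(1, n_steps):
        drift = theta * (mu - x[t - 1]) * dt
        diffusion = sigma * np.sqrt(dt) * rng.standard_normal()
        x[t] = x[t - 1] + drift + diffusion
    return x

rng = np.random.default_rng(0)
# "Exploitation" regime: strong reversion to the current point of interest
# (fixation-like drift around mu = 0).
fixation_like = simulate_ou(theta=10.0, mu=0.0, sigma=0.5, x0=0.0,
                            dt=0.001, n_steps=1000, rng=rng)
# "Exploration" regime: weak reversion toward a new attractor at mu = 5
# (saccade-like relocation toward a different region of the image).
relocation_like = simulate_ou(theta=1.0, mu=5.0, sigma=2.0, x0=0.0,
                              dt=0.001, n_steps=1000, rng=rng)
```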

Keywords: Bayesian inference; behaviour characteristics; biometric recognition; eye movements; foraging theory; gaze identification; machine learning; stochastic processes; visual attention.

Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1
Gaze dynamics of three different observers scrutinising the image for 2 s, recorded through an eye-tracking device. The unfolding of gaze deployment for each observer is displayed as a coloured trajectory (blue, yellow, and red, respectively) overlaid on the viewed image. The onset of gaze unfolding is located approximately at the centre of the image for all viewers. Image and eye-tracking data are publicly available from the FIFA dataset [15].
Figure 2
The dynamics of one observer’s gaze while visually foraging on the image landscape, as recorded through an eye-tracking device. Left: the “raw data”, represented as a time sequence of spatial coordinates (red dots), displayed as a red trace overlaid on the viewed image. Right: the observer’s scan path, namely the continuous raw-data trace parsed into a discrete sequence of fixations (yellow disks) and saccades (segments between subsequent fixations); disk radius is proportional to fixation time. Image and eye-tracking data are publicly available from the FIFA dataset [15].
Figure 3
A zoom-in on fixations and saccades as recorded by high-frequency eye-trackers.
Figure 4
The proposed approach at a glance.
Figure 5
Receiver Operating Characteristic (ROC) curve.
Figure 6
Confusion matrix (left) and Cumulative Match Score (CMS) curve (right).
Figure 7
Heatmap contrasting the model-based descriptors of the observers’ visual behaviour. Each row of the map renders the summary vector v(id) for subject id; the numeric values of the vector components are mapped to colours as depicted in the side bar.
Figure 8
Hinton diagram of the correlation coefficients calculated over the average feature vectors of the subjects. The square size is proportional to the magnitude of the correlation; the colour encodes the value of the correlation coefficient within the range depicted in the side bar.
Figure 9
Kernel density estimation (KDE) plots for the distributions of the correlation coefficients calculated over all the stimuli for each subject (Intra-subject) and between subjects (Inter-subject). See text for explanation.
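For orientation, the intra-subject distribution collects correlations between feature vectors of the same subject across stimuli, while the inter-subject distribution collects correlations between vectors of different subjects. A minimal Python sketch of how such distributions could be assembled, with placeholder random arrays standing in for the paper's inferred O-U feature vectors (all shapes and values are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
# Placeholder: feats[s] holds one feature vector per stimulus for subject s
# (20 stimuli x 40 features each; shapes are made up for illustration).
feats = {s: rng.normal(loc=s, scale=1.0, size=(20, 40)) for s in range(5)}

intra, inter = [], []
for s, vs in feats.items():
    # Intra-subject: correlate pairs of vectors from the same subject.
    for i in range(len(vs)):
        for j in range(i + 1, len(vs)):
            intra.append(np.corrcoef(vs[i], vs[j])[0, 1])
    # Inter-subject: correlate vectors across distinct subjects.
    for t, vt in feats.items():
        if t <= s:
            continue
        for vi in vs:
            for vj in vt:
                inter.append(np.corrcoef(vi, vj)[0, 1])

print(f"mean intra-subject r = {np.mean(intra):.2f}, "
      f"mean inter-subject r = {np.mean(inter):.2f}")
```

KDE curves like those in Figure 9 could then be drawn over the two lists with, e.g., seaborn's kdeplot.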
Figure 10
(a) Semi-supervised UMAP 2D projection of a subset of 20% of the original O-U parameters. (b) Semi-supervised UMAP 2D projection of a subset of 50% of the original O-U parameters. (c) Projection obtained using the full feature vector. In cases (a) and (b), the feature subset is constructed by selecting features at random via uniform sampling. Each point represents a subject (identified by a colour in the side bar) observing a specific image.
Figure 11
(a) Semi-supervised UMAP 2D projection of the concatenation of the fixation and saccade O-U parameters. Each point represents the parameters extracted from one subject on an image of the FIFA dataset; points of the same colour indicate the same subject observing different images. (b) UMAP 2D projection of vectors from the score-level fusion representation. Each point represents a subject (identified by a colour in the side bar) observing a specific image.
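The reference UMAP implementation (the umap-learn package) supports this semi-supervised setting directly: a label vector passed to fit_transform guides the embedding, and entries set to -1 are treated as unlabelled. A minimal sketch with synthetic placeholder features (the array shapes, number of subjects, and masking fraction are illustrative assumptions, not the paper's configuration):

```python
import numpy as np
import umap  # pip install umap-learn

rng = np.random.default_rng(0)
# Placeholder feature matrix: one row of O-U parameters per (subject, image)
# trial, plus the subject id of each row (all shapes are illustrative).
X = rng.normal(size=(300, 40))
subject_ids = rng.integers(0, 10, size=300)

# Semi-supervised target: hide half the labels; umap-learn treats -1 as
# "unlabelled" and uses the remaining labels to guide the embedding.
y = subject_ids.copy()
y[rng.random(y.shape[0]) < 0.5] = -1

embedding = umap.UMAP(n_components=2, random_state=0).fit_transform(X, y=y)
print(embedding.shape)  # (300, 2): one 2D point per trial, as in the figure
```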

References

    1. Canosa R. Real-world vision: Selective perception and task. ACM Trans. Appl. Percept. 2009;6:11. doi: 10.1145/1498700.1498705.
    2. Cerf M., Frady E., Koch C. Faces and text attract gaze independent of the task: Experimental data and computer model. J. Vis. 2009;9:10. doi: 10.1167/9.12.10.
    3. Faber M., Bixler R., D’Mello S.K. An automated behavioral measure of mind wandering during computerized reading. Behav. Res. Methods. 2018;50:134–150. doi: 10.3758/s13428-017-0857-y.
    4. Zhang H., Anderson N.C., Miller K.F. Refixation patterns of mind-wandering during real-world scene perception. J. Exp. Psychol. Hum. Percept. Perform. 2021;47:36. doi: 10.1037/xhp0000877.
    5. Lee H.H., Chen Z.L., Yeh S.L., Hsiao J.H., Wu A.Y. When eyes wander around: Mind-wandering as revealed by eye movement analysis with hidden Markov models. Sensors. 2021;21:7569. doi: 10.3390/s21227569.
