Head-mounted mobile eye-tracking in the domestic dog: A new method

Madeline H Pelgrim et al. Behav Res Methods. 2023 Jun;55(4):1924-1941. doi: 10.3758/s13428-022-01907-3. Epub 2022 Jul 5.

Abstract

Humans rely on dogs for countless tasks, ranging from companionship to highly specialized detection work. In their daily lives, dogs must navigate a human-built visual world, yet comparatively little is known about what dogs visually attend to as they move through their environment. Real-world eye-tracking, or head-mounted eye-tracking, allows participants to freely move through their environment, providing more naturalistic results about visual attention while interacting with objects and agents. In dogs, real-world eye-tracking has the potential to inform our understanding of cross-species cognitive abilities as well as working dog training; however, a robust and easily deployed head-mounted eye-tracking method for dogs has not previously been developed and tested. We present a novel method for real-world eye-tracking in dogs, using a simple head-mounted mobile apparatus mounted onto goggles designed for dogs. This new method, adapted from systems that are widely used in humans, allows for eye-tracking during more naturalistic behaviors, namely walking around and interacting with real-world stimuli, as well as reduced training time as compared to traditional stationary eye-tracking methods. We found that while completing a simple forced-choice treat finding task, dogs look primarily to the treat, and we demonstrated the accuracy of this method using alternative gaze-tracking methods. Additionally, eye-tracking revealed more fine-grained time course information and individual differences in looking patterns.

Keywords: Comparative cognition; Domestic dog; Eye-tracking; Visual attention.


Conflict of interest statement

The authors declare that they have no conflict of interest. The experiment was approved by the University of Toronto’s University Animal Care Committee (UACC). Procedures were in accordance with Ontario’s Animals for Research Act and the guidelines of the federal Canadian Council on Animal Care, and fully complied with the APA Ethical Standards for Use of Animals in Research.

Figures

Fig. 1
a A dog wearing size XL goggles, in the full eye-tracking equipment during a trial. b A dog wearing size M goggles, in the full eye-tracking gear with the LCD screen, which was removed during participation
Fig. 2
This full calibration procedure (a–f, seen here in the Yarbus software) is repeated at the start and end of each session. Each point provides a known location where the gaze from the eye camera (top right) lines up with a specific location in the scene camera’s field of view (as indicated by the yellow dots). The dog’s owner (g) immobilizes the head
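For readers who want a concrete picture of what a scene-mapped calibration computes, the sketch below fits a simple least-squares affine map from eye-camera pupil coordinates to scene-camera coordinates using the collected calibration point pairs. This is only an illustrative assumption about the form of the mapping; the Yarbus software's actual calibration procedure is not described in this page, and all function and variable names are hypothetical.

    import numpy as np

    def fit_affine_calibration(pupil_xy, scene_xy):
        """Least-squares affine map from eye-camera pupil coordinates to
        scene-camera coordinates, fit on the calibration point pairs.
        pupil_xy and scene_xy are arrays of shape (n_points, 2)."""
        n = pupil_xy.shape[0]
        design = np.hstack([pupil_xy, np.ones((n, 1))])  # columns: x, y, 1
        coeffs, *_ = np.linalg.lstsq(design, scene_xy, rcond=None)
        return coeffs  # shape (3, 2)

    def map_gaze(pupil_xy, coeffs):
        """Project new pupil coordinates into the scene camera's field of view."""
        n = pupil_xy.shape[0]
        design = np.hstack([pupil_xy, np.ones((n, 1))])
        return design @ coeffs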
Fig. 3
The test-calibration and verifier points allow for the identified center of gaze, shown here as a blue cursor, to be checked for spatial accuracy. A An example of a good calibration: the center of gaze is in the correct place, and there are enough calibration points (indicated here by small yellow and blue dots) spread across the field of view. The next test-calibration point can now be verified. B An example of a poor and incomplete calibration: the center of gaze is offset from the known true gaze location (the treat), and the calibration does not have enough identified points (small yellow and blue dots). More calibration points must be included before verification can begin
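As a minimal sketch of the spatial-accuracy check that the verifier points support, the code below assumes the estimated center of gaze and the known true gaze location (e.g., the treat) are available in scene-camera pixel coordinates. The thresholds and the function name are illustrative, not values taken from the paper.

    import numpy as np

    def calibration_acceptable(gaze_xy, target_xy, n_points,
                               max_offset_px=30, min_points=6):
        """Accept a calibration only if the estimated center of gaze lies within
        max_offset_px of the known target and enough calibration points were
        identified across the field of view. Thresholds are illustrative."""
        offset = np.linalg.norm(np.asarray(gaze_xy, float) - np.asarray(target_xy, float))
        return offset <= max_offset_px and n_points >= min_points

    # A gaze estimate 12 px from the treat, with 8 identified points, would pass.
    print(calibration_acceptable((640, 410), (652, 408), n_points=8))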
Fig. 4
The six regions of interest coded during a trial from eye-tracking data. Room camera data were coded as falling into one of four ROIs, where ROIs 2 and 5 (treat-hand and treat-plate) and ROIs 3 and 6 (empty-hand and empty-plate) were combined to be coded as Treat or Empty, respectively
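The ROI combination described above is simple to express in code. The sketch below collapses hypothetical per-frame ROI codes 1–6 into the combined coding categories; the labels for the remaining ROIs are placeholders, since only the treat- and empty-side regions are named in the caption.

    # ROIs 2 and 5 (treat-hand, treat-plate) collapse to "Treat";
    # ROIs 3 and 6 (empty-hand, empty-plate) collapse to "Empty".
    # "ROI-1" and "ROI-4" are placeholder labels for the remaining regions.
    ROI_MAP = {1: "ROI-1", 2: "Treat", 3: "Empty",
               4: "ROI-4", 5: "Treat", 6: "Empty"}

    def collapse_rois(roi_codes):
        """Map per-frame ROI codes (1-6) onto the combined coding categories."""
        return [ROI_MAP[code] for code in roi_codes]

    print(collapse_rois([2, 5, 3, 6, 1]))  # ['Treat', 'Treat', 'Empty', 'Empty', 'ROI-1']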
Fig. 5
A comparison of the data collected from the room camera (top) and the eye-tracking data (bottom) for a single trial. This demonstrates the finer spatial and temporal resolution of the eye-tracking data. Eye-tracking is able to capture rapid, shorter-duration looks. In addition, eye-tracking captures moments where the dog’s point of regard (where they are actually looking) differs from their head orientation, as observed from the room camera, like the glances off-target seen here in gray. See Supplementary Materials for video of this trial (https://osf.io/v8tdj/?view_only=2513d2cafc164586be7cf2fb580c5c0f)
Fig. 6
A comparison of the same moment in a trial from the two camera types. Left: Mid-trial data from the room camera. Tracking the head orientation from this angle suggests the dog is looking at the treat. Right: Mid-trial data from the eye-tracker. Using eye-tracking shows the dog was actually looking out the window
Fig. 7
The proportion of time during the observation phase (start of trial until dogs were released to make their choice) that dogs spent looking to the various ROIs. The five dogs in this sample displayed varying search strategies, with some dogs not looking at a given ROI at all (i.e., CCL485, who never looked at the experimenter’s hands after the treat presentation). Incorrect trials, indicated with overhead asterisks (*), do not display a consistent pattern of looks across dogs. Proportions graphs for the full trial (start of trial until choice) for each dog are available in the Supplementary Materials
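A minimal sketch of how looking proportions like these could be computed from frame-by-frame ROI codes during the observation phase, assuming a constant frame rate; the column names and example data below are hypothetical and do not come from the study.

    import pandas as pd

    # Hypothetical per-frame gaze coding for two trials of one dog's observation phase.
    frames = pd.DataFrame({
        "trial": [1, 1, 1, 1, 1, 2, 2, 2, 2],
        "roi":   ["Treat", "Treat", "Empty", "Experimenter", "Treat",
                  "Empty", "Empty", "Treat", "Off-target"],
    })

    # Proportion of observation-phase frames spent looking at each ROI, per trial.
    proportions = (frames.groupby("trial")["roi"]
                         .value_counts(normalize=True)
                         .rename("proportion")
                         .reset_index())
    print(proportions)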

