Computational Methods to Measure Patterns of Gaze in Toddlers With Autism Spectrum Disorder

Zhuoqing Chang et al. JAMA Pediatr. 2021 Aug 1;175(8):827-836. doi: 10.1001/jamapediatrics.2021.0530.

Abstract

Importance: Atypical eye gaze is an early-emerging symptom of autism spectrum disorder (ASD) and holds promise for autism screening. Current eye-tracking methods are expensive and require special equipment and calibration. There is a need for scalable, feasible methods for measuring eye gaze.

Objective: Using computational methods based on computer vision analysis, we evaluated whether an app deployed on an iPhone or iPad that displayed strategically designed brief movies could elicit and quantify differences in eye-gaze patterns of toddlers with ASD vs typical development.

Design, setting, and participants: A prospective study in pediatric primary care clinics was conducted from December 2018 to March 2020, comparing toddlers with and without ASD. Caregivers of 1564 toddlers were invited to participate during a well-child visit. A total of 993 toddlers (63%) completed study measures. Enrollment criteria were age 16 to 38 months, general good health, an English- or Spanish-speaking caregiver, and the toddler's ability to sit and view the app. Participants were screened with the Modified Checklist for Autism in Toddlers-Revised With Follow-up during routine care. Children were referred by their pediatrician for diagnostic evaluation based on results of the checklist or if the caregiver or pediatrician was concerned. Forty toddlers were subsequently diagnosed with ASD.

Exposures: A mobile app displayed on a smartphone or tablet.

Main outcomes and measures: Computer vision analysis quantified eye-gaze patterns elicited by the app, which were compared between toddlers with ASD vs typical development.
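
The abstract does not specify the underlying computer vision pipeline. As a rough, illustrative sketch of how per-frame gaze laterality might be estimated from front-camera video, the example below uses MediaPipe Face Mesh iris and eye-corner landmarks to compute a horizontal gaze ratio; the video file name, the landmark indices, and the ratio heuristic are assumptions for illustration, not the authors' method.

```python
# Illustrative sketch only -- NOT the authors' pipeline. Estimates a coarse
# horizontal gaze ratio per video frame from MediaPipe Face Mesh landmarks
# (refine_landmarks=True adds iris points). "session.mp4" is a hypothetical
# recording from the device's front camera.
import cv2
import mediapipe as mp

def horizontal_gaze_ratios(video_path: str) -> list[float]:
    """Return one value per frame: near 0 when the iris sits by one eye corner,
    near 1 by the other, i.e., a rough left/right-of-screen proxy."""
    ratios = []
    face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1,
                                                refine_landmarks=True)
    cap = cv2.VideoCapture(video_path)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        result = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if not result.multi_face_landmarks:
            continue  # no face detected in this frame
        lm = result.multi_face_landmarks[0].landmark
        # Commonly cited Face Mesh indices: 473 = iris center of one eye,
        # 362 / 263 = that eye's inner / outer corners (normalized x in [0, 1]).
        iris, inner, outer = lm[473], lm[362], lm[263]
        ratios.append((iris.x - inner.x) / (outer.x - inner.x + 1e-6))
    cap.release()
    face_mesh.close()
    return ratios

# Downstream, these per-frame ratios could be thresholded into left/right gaze
# labels and aggregated into per-movie features like those reported below.
```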

Results: Mean age of the sample was 21.1 months (range, 17.1-36.9 months), and 50.6% were boys, 59.8% White individuals, 16.5% Black individuals, 23.7% other race, and 16.9% Hispanic/Latino individuals. Distinctive eye-gaze patterns were detected in toddlers with ASD, characterized by reduced gaze to social stimuli and to salient social moments during the movies, and previously unknown deficits in coordination of gaze with speech sounds. The area under the receiver operating characteristic curve discriminating ASD vs non-ASD using multiple gaze features was 0.90 (95% CI, 0.82-0.97).

Conclusions and relevance: The app reliably measured both known and new gaze biomarkers that distinguished toddlers with ASD vs typical development. These novel results suggest potential for developing scalable autism screening tools that are exportable to natural settings and that enable data sets amenable to machine learning.


Conflict of interest statement

Conflict of Interest Disclosures: Dr Chang reports receiving royalties from Apple Inc from licensing during the conduct of the study. Dr Aiello reports receiving grants from the National Institutes of Health (NIH) during the conduct of the study and personal fees from Private Diagnostic Clinic outside the submitted work. Dr Baker reports having a patent pending (15141391) and royalties from Apple Inc. Dr Carpenter reports receiving grants from NIH during the conduct of the study, having a patent pending (15141391) for developing technology that has been licensed to Apple, Inc, and receiving financial compensation for it, along with Duke University. Dr Compton reports receiving grants from the National Institute of Mental Health (NIMH) and personal fees from Mursion, Inc during the conduct of the study. Dr Davis reports receiving grants from NIH during the conduct of the study and research support from Akili Interactive outside the submitted work. Mr Espinosa reports receiving grants from the National Institute of Child Health and Human Development (NICHD), grants from NIMH, and fees from Apple Storage during the conduct of the study, and had a patent pending (15141391) and royalties for technology licensed to Apple Inc. Mr Harris reports receiving grants from NIH during the conduct of the study, having a patent pending (15141391) for developing technology that has been licensed to Apple, Inc, and receiving financial compensation for it, along with Duke University. Dr Howard reports receiving personal fees from Roche outside the submitted work. Dr Kollins reports receiving grants from NIH during the conduct of the study. Dr Dawson reports receiving grants from NICHD (P50HD093074), NIMH (R01MH121329 and R01MH120093), The Marcus Foundation, and Simons Foundation; receiving support from Apple Inc for data storage; receiving software during the conduct of the study; receiving personal fees from Apple; being a consultant for Apple; developing technology and data related to the app that has been licensed to Apple Inc, from which both she and Duke University have benefited financially; receiving other from Janssen Scientific Advisory Board; receiving other from Akili Scientific Advisory Board; receiving personal fees from LabCorp Scientific Advisory Board, Roche Scientific Advisory Board, Tris Pharma Scientific Advisory Board; receiving consultant fees from the Gerson Lehrman Group, Guidepoint, Axial Ventures, and Teva Pharmaceutical; receiving other from DASIO CEO; and receiving book royalties from Guilford Press, Oxford University Press, and Springer Nature Press outside the submitted work. Dr Dawson also reports patents pending (1802952, 15141391, 1802942, and 16493754). Dr Sapiro reports receiving consultant fees from Apple at submission of the manuscript, speaker fees from Johnson & Johnson, nonfinancial support from DASIO (nonactive cofounder) during the conduct of the study, consultant fees from Restore3D on activities unrelated to this work, nonfinancial support from SIS as a board member (unrelated work), and consultant fees from Volvo on activities outside the submitted work. Dr Sapiro has a patent pending (15141391) for technology on behavioral coding and corresponding data licensed to Apple (Duke University has licensed technology to Apple), a patent for technology for video deblurring licensed to Adobe (licensed by Duke University) not related to this work, and a patent for technology for face recognition licensed to Coral (licensed by Duke University) not related to this work. 
After the manuscript was submitted, Dr Sapiro started a sabbatical and is currently working at Apple part-time. No other disclosures were reported.

Figures

Figure 1. Social vs Nonsocial Gaze Preference
Gaze data for 4 movies that depicted a person on one side of the screen playing with toys located on the opposite side of the screen. A, Distribution of percentage right scores (percentage of time spent looking at the right side of the screen) for each movie. B, Distribution of silhouette scores for each movie. C, Scatterplots displaying individual participant percentage right (horizontal axis) and silhouette scores (vertical axis). For “spinning pinwheel” (iPhone), the person is on the right side of the screen; for “blowing bubbles” (iPhone), the person is on the left side; for “spinning top” (iPad), the person is on the right side; and for “blowing bubbles” (iPad), the person is on the left side. ASD indicates autism spectrum disorder.
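
The caption does not define the two scores precisely; a minimal sketch of one plausible computation is shown below, assuming normalized (x, y) gaze coordinates per child, a half-screen threshold for the percentage right score, and a two-cluster k-means grouping before computing the silhouette score. These choices are illustrative assumptions, not the study's exact definitions.

```python
# Minimal sketch (not the authors' implementation) of the two per-movie
# gaze features described in Figure 1.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def percentage_right(gaze_xy: np.ndarray) -> float:
    """Fraction of gaze samples landing on the right half of the screen
    (gaze_xy is an (N, 2) array of normalized coordinates in [0, 1])."""
    return float(np.mean(gaze_xy[:, 0] > 0.5))

def gaze_silhouette(gaze_xy: np.ndarray, n_clusters: int = 2) -> float:
    """Silhouette score of the gaze samples after k-means clustering;
    higher values indicate tighter, better-separated fixation clusters."""
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(gaze_xy)
    return float(silhouette_score(gaze_xy, labels))

# Toy example: gaze alternating between a toy on the right and a person on the left
rng = np.random.default_rng(0)
gaze_xy = np.vstack([rng.normal([0.8, 0.5], 0.05, size=(200, 2)),
                     rng.normal([0.2, 0.5], 0.05, size=(100, 2))])
print(percentage_right(gaze_xy), gaze_silhouette(gaze_xy))
```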
Figure 2. Gaze Patterns During Salient Social Moments
Children’s gaze behavior was measured during a salient segment of the movie, during which the person paused expectantly before enthusiastically blowing bubbles. Displayed are the percentage right scores reflecting attentional preference for the toys (right side) vs the person (left side) for toddlers with typical development (blue) and autism spectrum disorder (orange) during this interval of the movie (A) and the entire movie (B).
Figure 3. Gaze-Speech Coordination
Results from a movie designed to assess gaze patterns while the child watched a conversation between 2 people displayed on opposite sides of the screen (illustrated in eFigure 4 in the Supplement). A, Distributions of speech-gaze time correlations and silhouette scores, and a scatterplot jointly showing individual data for these 2 measures, for children with autism spectrum disorder (ASD) (orange) and typical development (blue). B, Alternating speech signal between the 2 women having the conversation shown in the movie. C, Alternating gaze patterns in relation to the conversational speech sounds for a child with ASD and a child with typical development.
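
One plausible way to quantify the speech-gaze coordination described above is to correlate a per-frame "which speaker is talking" signal with the per-frame horizontal gaze coordinate, optionally allowing a short reaction lag. The sketch below implements that idea on toy data; the signal definitions and the lag handling are assumptions, not the study's published measure.

```python
# Illustrative sketch only: a simple speech-gaze correlation measure.
# `speaking_side` (1 = right speaker talking, 0 = left) and `gaze_x`
# (1 = right, 0 = left) are hypothetical per-frame signals.
import numpy as np

def speech_gaze_correlation(speaking_side: np.ndarray, gaze_x: np.ndarray,
                            lag_frames: int = 0) -> float:
    """Pearson correlation between which side is speaking and the horizontal
    gaze coordinate, after shifting gaze by `lag_frames` for reaction time."""
    if lag_frames > 0:
        speaking_side = speaking_side[:-lag_frames]
        gaze_x = gaze_x[lag_frames:]
    return float(np.corrcoef(speaking_side, gaze_x)[0, 1])

# Toy example: gaze that follows the alternating speakers with a ~0.3 s lag at 30 fps
t = np.arange(900)
speaking_side = ((t // 90) % 2).astype(float)   # speakers alternate every 3 s
gaze_x = np.roll(speaking_side, 10) + np.random.default_rng(1).normal(0, 0.1, t.size)
print(speech_gaze_correlation(speaking_side, gaze_x, lag_frames=10))
```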
Figure 4. Gaze Shifts to Nonhuman Stimuli (Puppy)
In this control movie, a puppy’s face appeared twice on the right and once on the left. Shown are the mean (SD) gaze horizontal coordinates (lines represent means and shaded areas represent SDs) as the puppy’s face appeared in the right-right-left pattern on an iPad for children with typical development (blue) and autism spectrum disorder (ASD) (orange). On the vertical axis, 1 is right and 0 is left.
Figure 5. Combining Gaze Features
Receiver operating characteristic (ROC) curves and areas under the curve (AUCs) obtained for models trained on percentage right scores and gaze-speech correlations for each movie and their combination for the data collected on the iPhone (A) and iPad (B). Receiver operating characteristics appear nonsmooth due to the small number of levels in the model. On the right, the per-feature classification boundaries associated with the gaze features extracted from each movie are shown, as automatically computed by the classification strategy. A shallow tree was used to learn to classify from the data; the portions of the feature space associated with a higher risk of autism spectrum disorder (ASD) are shown colored.
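
As a hedged sketch of the setup described in this caption (a shallow decision tree over per-movie gaze features, evaluated with ROC curves), the example below trains a depth-2 tree on synthetic two-feature data and reports a cross-validated AUC. The class sizes, feature distributions, tree depth, and cross-validation scheme are placeholders, not the published model.

```python
# Hedged sketch, not the study's pipeline: a shallow tree over two synthetic
# gaze features (e.g., a percentage right score and a speech-gaze correlation),
# scored with a cross-validated ROC AUC.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
n_asd, n_td = 40, 300  # class sizes loosely mirroring the study; data are purely synthetic
X = np.vstack([
    rng.normal([0.8, 0.1], 0.1, size=(n_asd, 2)),  # hypothetical ASD-like feature values
    rng.normal([0.5, 0.5], 0.1, size=(n_td, 2)),   # hypothetical non-ASD feature values
])
y = np.array([1] * n_asd + [0] * n_td)

clf = DecisionTreeClassifier(max_depth=2, random_state=0)       # "shallow tree"
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_predict(clf, X, y, cv=cv, method="predict_proba")[:, 1]

fpr, tpr, _ = roc_curve(y, scores)                              # points of the ROC curve
print("cross-validated AUC:", round(roc_auc_score(y, scores), 3))
```

Because a shallow tree outputs only a handful of distinct predicted probabilities, the ROC curve has few operating points, which is consistent with the nonsmooth curves noted in the caption.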
