Computational approaches to understanding interaction and development
- PMID: 35249682
- PMCID: PMC9840818
- DOI: 10.1016/bs.acdb.2021.12.002
Abstract
Audio-visual recording and location tracking produce enormous quantities of digital data with which researchers can document children's everyday interactions in naturalistic settings and assessment contexts. Machine learning and other computational approaches can produce replicable, automated measurements of these big behavioral data. The economies of scale afforded by repeated automated measurements offer a potent approach to investigating linkages between real-time behavior and developmental change. In our work, automated measurements of audio from child-worn recorders, which quantify the frequency of child and adult speech and index its phonemic complexity, are paired with ultra-wideband radio tracking of children's location and interpersonal orientation. Applications of objective measurement indicate the influence of adult behavior on both expert ratings of attachment behavior and ratings of autism severity, suggesting the role of dyadic factors in these "child" assessments. In the preschool classroom, location/orientation measures provide data-driven measures of children's social contact, fertile ground for vocal interactions. Both the velocity of children's movement toward one another and their social contact with one another evidence homophily: children with autism spectrum disorder, children with other developmental disabilities, and typically developing children were more likely to interact with children in their own group, even in inclusive preschool classrooms designed to promote interchange among all children. In the vocal domain, the frequency of peer speech and the phonemic complexity of teacher speech predict the frequency and phonemic complexity of children's own speech over multiple timescales. Moreover, children's own speech predicts their assessed language abilities across disability groups, suggesting how everyday interactions facilitate development.
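To make the location-based measures concrete, the sketch below shows one way the two movement quantities mentioned above could be derived from tracked positions: approach velocity (the rate at which the distance between two children changes) and social contact (whether two children are within a proximity threshold). This is a minimal illustration, not the authors' implementation; the 1-meter contact threshold and the use of 2-D position coordinates are assumptions for the example.

```python
import math

def approach_velocity(p1_t0, p1_t1, p2_t0, p2_t1, dt):
    """Rate of change of interpersonal distance in meters/second.

    p1_t0/p1_t1 and p2_t0/p2_t1 are (x, y) positions of two children
    at two successive samples dt seconds apart. Negative values mean
    the children are moving toward one another.
    """
    d0 = math.dist(p1_t0, p2_t0)  # distance at the first sample
    d1 = math.dist(p1_t1, p2_t1)  # distance at the second sample
    return (d1 - d0) / dt

def in_social_contact(p1, p2, threshold_m=1.0):
    """Flag social contact when two children are within threshold_m meters.

    The 1.0 m default is a hypothetical cutoff for illustration only.
    """
    return math.dist(p1, p2) <= threshold_m
```

For example, if one child stands still at (0, 0) while another moves from (4, 0) to (2, 0) over one second, `approach_velocity` returns -2.0 m/s, and the pair would be flagged as in contact once their separation drops below the threshold. A fuller analysis of homophily would then compare contact rates within versus across disability groups.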
Keywords: Audio; Automated measurement; Deep learning; Development; Interaction; Language; Machine learning; Objective; Radio frequency identification; Social.
Copyright © 2022 Elsevier Inc. All rights reserved.
