Sci Rep. 2024 Jul 6;14(1):15580.
doi: 10.1038/s41598-024-66312-6.

Artificial intelligence detects awareness of functional relation with the environment in 3 month old babies


Massoud Khodadadzadeh et al. Sci Rep. 2024.

Abstract

A recent experiment probed how purposeful action emerges in early life by manipulating infants' functional connection to an object in the environment (i.e., tethering an infant's foot to a colorful mobile). Vicon motion capture data from multiple infant joints were used here to create Histograms of Joint Displacements (HJDs) to generate pose-based descriptors for 3D infant spatial trajectories. Using HJDs as inputs, machine and deep learning systems were tasked with classifying the experimental state from which snippets of movement data were sampled. The architectures tested included k-Nearest Neighbour (kNN), Linear Discriminant Analysis (LDA), Fully connected network (FCNet), 1D-Convolutional Neural Network (1D-Conv), 1D-Capsule Network (1D-CapsNet), 2D-Conv and 2D-CapsNet. Sliding window scenarios were used for temporal analysis to search for topological changes in infant movement related to functional context. kNN and LDA achieved higher classification accuracy with single joint features, while deep learning approaches, particularly 2D-CapsNet, achieved higher accuracy on full-body features. For each AI architecture tested, measures of foot activity displayed the most distinct and coherent pattern alterations across different experimental stages (reflected in the highest classification accuracy rate), indicating that interaction with the world impacts the infant behaviour most at the site of organism~world connection.
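A rough sketch of the shallow-classifier arm of this comparison (kNN and LDA; the deep architectures are omitted) is given below in Python with scikit-learn. The dummy data, the 64-bin feature length and the five-class labels are illustrative assumptions, not the authors' code or data.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Dummy stand-in: one HJD feature vector (64 direction bins) per movement window,
# with one of five stage labels (B1, B2, CR1, CR2, DC) per window.
rng = np.random.default_rng(0)
X = rng.random((200, 64))
y = rng.integers(0, 5, size=200)

for name, clf in [("kNN", KNeighborsClassifier(n_neighbors=5)),
                  ("LDA", LinearDiscriminantAnalysis())]:
    scores = cross_val_score(clf, X, y, cv=5)             # 5-fold cross-validation
    print(f"{name}: mean accuracy = {scores.mean():.2f}")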

Conflict of interest statement

The authors declare no competing interests.

Figures

Figure 1
(a) Infant marker layout. (b) Infant in crib with mobile above. A 3D reconstruction of the markers is overlaid on top of the video recording. All coloured lines represent the virtual skeleton of the ‘baby-mobile object’ being tracked. The coloured dots are the reflective markers. (c) A 3D representation of another infant without the video feed.
Figure 2
Schematic of the experimental paradigm. The experimental stages which manipulate the infant’s functional context proceed from left to right. See the Experiment description.
Figure 3
Overview of the feature extraction and classification steps.
Figure 4
Different frames from one video camera viewpoint of subject 1. Reflective markers are easily visible on this infant’s right leg.
Figure 5
3D reconstructed tracking during four different stages in the experiment: (a) spontaneous baseline, no mobile movement; (b) uncoupled reactive baseline, experimenter moves mobile; (c) tethered stage; (d) decoupled stage.
Figure 6
x, y, and z coordinates of body landmarks.
Figure 7
The top panel shows an idealized progression of experimental stages depicted in different colours (red, yellow, green, and blue represent spontaneous baseline (B1), uncoupled reactive baseline (B2), tethered stage (CR) and untethered stage (DC), respectively). The entire procedure was planned to span 12 minutes. To investigate whether various AI classification architectures could differentiate initial infant exploratory movement during tethered interaction from possible intentional activity later in the tethered phase, the current project split the tethered phase into two segments surrounding the most typical timing for infant discovery as indicated by coordination dynamics analysis (i.e., ~ 90s into coupling). These two segments were the first minute of infant~mobile coupling (CR1, shaded light green) and the third minute of coupling (CR2, shaded dark green). Outlined in black, we planned to classify movement using two minutes of data for each of B1, B2, CR and DC (with CR split into CR1 and CR2). The bottom panel illustrates the available data for each stage for each of the five infants (S1-S5) relative to the idealized plan. Sliding windows (window width = 5s; overlap = 1s) were used across all experimental stages to assess fluctuations in classification accuracy across time (indicated by small black windows drawn to scale and arrows).
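A minimal sketch of the sliding-window segmentation described in this caption follows. The 100 Hz sampling rate is an assumption, and reading "overlap = 1s" as one second of shared data between consecutive 5 s windows (i.e., a 4 s advance) is one possible interpretation, not a detail confirmed by the caption.

import numpy as np

def sliding_windows(n_samples, fs=100, width_s=5.0, overlap_s=1.0):
    """Yield (start, end) sample indices of overlapping analysis windows."""
    width = int(width_s * fs)
    step = int((width_s - overlap_s) * fs)   # 4 s advance between window starts
    for start in range(0, n_samples - width + 1, step):
        yield start, start + width

# Two minutes of dummy motion data at the assumed 100 Hz
windows = list(sliding_windows(n_samples=120 * 100))
print(len(windows), "windows; first three:", windows[:3])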
Figure 8
Keypoint locations are recalculated in reference to the pelvis to produce person-centred data (left). The partitioning of the 3D space into n bins is referenced by vectors alpha and theta (right).
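The sketch below illustrates the two operations in this caption, under the assumptions that theta and alpha correspond to azimuth and elevation and that the pelvis is a single keypoint; the joint count and indices are placeholders, not the study's marker set.

import numpy as np

def person_centre(keypoints, pelvis_idx=0):
    """Express every keypoint relative to the pelvis (frames x joints x 3)."""
    return keypoints - keypoints[:, pelvis_idx:pelvis_idx + 1, :]

def angular_bin(offset, n_theta=8, n_alpha=8):
    """Map a pelvis-relative 3D offset to one of n_theta * n_alpha direction bins."""
    x, y, z = offset
    theta = np.arctan2(y, x)                  # azimuth
    alpha = np.arctan2(z, np.hypot(x, y))     # elevation
    t = min(int((theta + np.pi) / (2 * np.pi) * n_theta), n_theta - 1)
    a = min(int((alpha + np.pi / 2) / np.pi * n_alpha), n_alpha - 1)
    return t * n_alpha + a

frames = np.random.default_rng(1).normal(size=(10, 17, 3))  # dummy mocap frames
centred = person_centre(frames)
print("bin of joint 16 in frame 0:", angular_bin(centred[0, 16]))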
Figure 9
Temporal analysis for each subject. Five ascending steps illustrate the correct labels for five distinct classes: B1 (spontaneous baseline, no mobile motion), B2 (experimenter-triggered mobile motion), CR1 (minute 1 of coupling), CR2 (2 minutes into coupling), and DC (decoupled, no mobile motion), respectively. Completely flat lines on each ascending step would indicate perfect labelling for each stage. 100 sliding steps are equivalent to 1s.
Figure 10
The moving average of the classification accuracy across time (s) for fused features (Full-body, Knees, Hands, and Feet). B1 (spontaneous baseline), B2 (uncoupled reactive baseline), CR1 (the first minute of the tethered stage), CR2 (~ 2 min. into the tethered stage), DC (decoupled stage).
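The curves in Figure 10 are moving averages of per-window accuracy; a generic version is sketched below, with the smoothing span (10 windows) chosen arbitrarily for illustration rather than taken from the paper.

import numpy as np

def moving_average(values, span=10):
    """Smooth a per-window accuracy trace with a simple box filter."""
    kernel = np.ones(span) / span
    return np.convolve(values, kernel, mode="valid")

acc = np.random.default_rng(2).uniform(0.4, 1.0, size=300)  # dummy accuracy trace
print(moving_average(acc)[:5])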
Figure 11
The trajectory of the left (connected) foot in S1 as a function of experimental stage. Stages: B1 (spontaneous baseline), B2 (uncoupled reactive baseline), CR1 (the first minute of the tethered stage), CR2 (~ 2 min. into the tethered stage), DC (decoupled stage).
Figure 12
The trajectory of the right (connected) foot in S2 as a function of experimental stage. Stages: B1 (spontaneous baseline), B2 (uncoupled reactive baseline), CR1 (the first minute of the tethered stage), CR2 (~ 2 min. into the tethered stage), DC (decoupled stage).
Figure 13
The HJD for Full-body joints over 10s in the middle of each stage for S2. The x-axis indexes the 64 bins of the modified spherical coordinate system. The y-axis reflects the percentage of time the joint occupied a particular bin. Stages: B1 (spontaneous baseline), B2 (uncoupled reactive baseline), CR1 (the first minute of the tethered stage), CR2 (~ 2 min. into the tethered stage), DC (decoupled stage).
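As a hedged illustration of how a per-stage histogram like this could be tabulated, the sketch below assumes each frame's joint offset has already been assigned one of the 64 direction bins; the 100 Hz sampling rate is an assumption not stated in the caption, while the 10 s middle slice and 64 bins follow it.

import numpy as np

FS = 100  # assumed sampling rate (Hz)

def middle_slice(stage_bin_indices, seconds=10):
    """Return the central `seconds` of a stage's per-frame bin indices."""
    mid = len(stage_bin_indices) // 2
    half = seconds * FS // 2
    return stage_bin_indices[mid - half:mid + half]

def hjd_percentages(bin_indices, n_bins=64):
    """Percentage of frames spent in each of the 64 direction bins."""
    counts = np.bincount(bin_indices, minlength=n_bins)
    return 100.0 * counts / counts.sum()

stage = np.random.default_rng(3).integers(0, 64, size=60 * FS)  # dummy 1-minute stage
print(hjd_percentages(middle_slice(stage))[:8])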
