Interpersonal predictive coding, not action perception, is impaired in autism

T von der Lühe et al. Philos Trans R Soc Lond B Biol Sci. 2016 May 5;371(1693):20150373. doi: 10.1098/rstb.2015.0373.
Abstract

This study was conducted to examine interpersonal predictive coding in individuals with high-functioning autism (HFA). Healthy and HFA participants observed point-light displays of two agents (A and B) performing separate actions. In the 'communicative' condition, the action performed by agent B responded to a communicative gesture performed by agent A. In the 'individual' condition, agent A's communicative action was substituted by a non-communicative action. Using a simultaneous masking-detection task, we demonstrate that observing agent A's communicative gesture enhanced visual discrimination of agent B for healthy controls, but not for participants with HFA. These results were not explained by differences in attentional factors as measured via eye-tracking, or by differences in the recognition of the point-light actions employed. Our findings, therefore, suggest that individuals with HFA are impaired in the use of social information to predict others' actions and provide behavioural evidence that such deficits could be closely related to impairments of predictive coding.

Keywords: high-functioning autism; intention recognition; predictive coding; social interaction.


Figures

Figure 1.
Example of a communicative signal trial. Agent A points to an object to be picked up; agent B bends down and picks it up. Agent B was presented using the limited-lifetime technique (six signal dots) and masked with temporally scrambled noise dots. The noise level displayed is the minimum allowed in the experiment (five noise dots). To provide a static depiction of the animated sequence, dots extracted from three different frames are superimposed; the silhouette depicting the human form was not visible in the stimulus display. (Adapted from [6].)
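As a rough illustration of how such a masked display might be assembled, the Python sketch below draws six signal dots from agent B's markers with a limited lifetime and adds temporally scrambled noise dots sampled from the agent's own trajectories. The function name, array layout and lifetime value are illustrative assumptions, not the study's actual stimulus code.

    import numpy as np

    def masked_frame(agent_b_xy, t, n_signal=6, n_noise=5, lifetime=5, rng=None):
        """Build one masked frame for agent B (illustrative sketch only).

        agent_b_xy : array of shape (n_frames, n_markers, 2) with point-light
                     marker coordinates for agent B.
        t          : index of the current animation frame.
        n_signal   : number of simultaneously visible signal dots (six in the study).
        n_noise    : number of noise dots (five was the experimental minimum).
        lifetime   : frames after which signal dots are relocated to other markers
                     (the "limited-lifetime" idea; the exact value is an assumption).
        """
        rng = np.random.default_rng() if rng is None else rng
        n_frames, n_markers, _ = agent_b_xy.shape

        # Limited lifetime: every `lifetime` frames, resample which markers carry
        # the signal dots, so no single joint stays continuously visible.
        rng_signal = np.random.default_rng(t // lifetime)
        signal_idx = rng_signal.choice(n_markers, size=n_signal, replace=False)
        signal_dots = agent_b_xy[t, signal_idx]

        # Temporally scrambled noise: dots taken from the agent's own marker
        # positions but at randomly chosen frames, so they share the local motion
        # statistics without forming a coherent figure.
        noise_frames = rng.integers(0, n_frames, size=n_noise)
        noise_markers = rng.integers(0, n_markers, size=n_noise)
        noise_dots = agent_b_xy[noise_frames, noise_markers]

        return np.vstack([signal_dots, noise_dots])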
Figure 2.
Schematic of the trial structure. The stimuli were shown in two intervals (intervals 1 and 2), separated by a fixation cross (500 ms); participants then decided which interval contained agent B. The maximum response time was 2000 ms.
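The two-interval forced-choice logic can be summarized in a short Python sketch. The presentation and response callbacks are hypothetical stand-ins, since the paper's implementation is not described in this excerpt; only the 500 ms fixation and 2000 ms response window come from the caption.

    import random
    from dataclasses import dataclass
    from typing import Optional

    FIXATION_MS = 500       # fixation cross between the two intervals
    MAX_RESPONSE_MS = 2000  # response window after the second interval

    @dataclass
    class TrialResult:
        target_interval: int       # 1 or 2: which interval actually contained agent B
        response: Optional[int]    # participant's choice, or None if no response in time
        correct: bool

    def run_trial(present_interval, show_fixation, get_response) -> TrialResult:
        """One two-interval forced-choice trial (illustrative sketch only).

        present_interval(interval, with_agent_b), show_fixation(duration_ms) and
        get_response(timeout_ms) are hypothetical callbacks standing in for the
        actual stimulus and response-collection code.
        """
        target = random.choice([1, 2])                    # interval containing agent B
        present_interval(1, with_agent_b=(target == 1))
        show_fixation(FIXATION_MS)                        # 500 ms fixation between intervals
        present_interval(2, with_agent_b=(target == 2))
        response = get_response(timeout_ms=MAX_RESPONSE_MS)
        return TrialResult(target, response, correct=(response == target))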
Figure 3.
Mean sensitivity (d′) across groups and conditions. Error bars depict 95% confidence intervals.
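Sensitivity d′ is conventionally computed from hit and false-alarm rates as z(H) − z(FA); whether the authors applied a two-interval correction (division by √2) is not stated in this excerpt, so the sketch below shows only the general form, with a log-linear correction against extreme rates.

    from scipy.stats import norm

    def d_prime(hits, misses, false_alarms, correct_rejections):
        """Signal-detection sensitivity d' from trial counts (illustrative sketch)."""
        # Log-linear correction: add 0.5 to each cell to avoid rates of 0 or 1.
        hit_rate = (hits + 0.5) / (hits + misses + 1)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
        return norm.ppf(hit_rate) - norm.ppf(fa_rate)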
Figure 4.
Correlation between AQ scores and d′ in the communicative (COM) condition across groups.
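Such a relationship would typically be quantified with a Pearson coefficient; the excerpt does not state which statistic was used, so the following is only an assumption about the general approach.

    from scipy.stats import pearsonr

    def aq_dprime_correlation(aq_scores, dprime_com):
        """Pearson correlation between AQ scores and d' in the communicative
        condition (illustrative sketch; the excerpt does not specify the statistic)."""
        r, p = pearsonr(aq_scores, dprime_com)
        return r, p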
Figure 5.
Bar graphs of correct responses on the post-test questionnaire across groups. Error bars depict 95% confidence intervals. Question 1: ‘Did you see this action in the previous task?’ Question 2: ‘Are the two agents communicating or acting independently from one another?’ Question 3: ‘Which alternative best describes this action?’

References

    1. Becchio C, Manera V, Sartori L, Cavallo A, Castiello U. 2012. Grasping intentions: from thought experiments to empirical evidence. Front. Hum. Neurosci. 6, 117. (doi:10.3389/fnhum.2012.00117)
    2. Zhu Q, Bingham GP. 2014. Seeing where the stone is thrown by observing a point-light thrower: perceiving the effect of action is enabled by information, not motor experience. Ecol. Psychol. 26, 229–261. (doi:10.1080/10407413.2014.957969)
    3. Ansuini C, Cavallo A, Bertone C, Becchio C. 2015. Intentions in the brain: the unveiling of mister Hyde. Neuroscientist 21, 126–135. (doi:10.1177/1073858414533827)
    4. Manera V, Becchio C, Schouten B, Bara BG, Verfaillie K. 2011. Communicative interactions improve visual detection of biological motion. PLoS ONE 6, e14594. (doi:10.1371/journal.pone.0014594)
    5. Neri P, Luu JY, Levi DM. 2006. Meaningful interactions can enhance visual discrimination of human agents. Nat. Neurosci. 9, 1186–1192. (doi:10.1038/nn1759)