Electrophysiological Indices of Audiovisual Speech Perception in the Broader Autism Phenotype
- PMID: 28574442
- PMCID: PMC5483633
- DOI: 10.3390/brainsci7060060
Abstract
When a speaker talks, the consequences can be both heard (audio) and seen (visual). A novel visual phonemic restoration task was used to assess behavioral discrimination and neural signatures (event-related potentials, or ERPs) of audiovisual processing in typically developing children with a range of social and communicative skills, assessed using the Social Responsiveness Scale, a measure of traits associated with autism. An auditory oddball design presented two types of stimuli to the listener: a clear exemplar of an auditory consonant-vowel syllable /ba/ (the more frequently occurring standard stimulus), and a syllable in which the auditory cues for the consonant were substantially weakened, creating a stimulus perceived as closer to /a/ (the infrequently presented deviant stimulus). All speech tokens were paired with either a face producing /ba/ or a face with a pixelated mouth containing motion but no visual speech. In this paradigm, the visual /ba/ should cause the auditory /a/ to be perceived as /ba/, attenuating the oddball response; in contrast, the pixelated video (which contains no articulatory information) should not have this effect. Behaviorally, participants showed visual phonemic restoration (reduced accuracy in detecting the deviant /a/) in the presence of a speaking face. In addition, ERPs in both an early time window (N100) and a later time window (P300) were sensitive to speech context (/ba/ or /a/) and modulated by face context (speaking face with visible articulation or pixelated mouth). Specifically, the N100 and P300 oddball responses were attenuated in the presence of a face producing /ba/ relative to a pixelated face, a possible neural correlate of the phonemic restoration effect. Notably, individuals with more traits associated with autism (though still in the non-clinical range) had smaller P300 responses overall, regardless of face context, suggesting generally reduced phonemic discrimination.
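As an illustration of how the oddball attenuation described above might be quantified, the following minimal Python sketch computes deviant-minus-standard mean amplitudes in early (N100) and late (P300) latency windows for each face context. It is not the authors' analysis pipeline: the sampling rate, epoch length, window bounds, electrode choice, and the simulated data are all assumptions for demonstration only.

```python
import numpy as np

# Assumed analysis parameters (illustrative, not from the paper).
SFREQ = 500                 # Hz, assumed sampling rate
TMIN = -0.1                 # epoch start relative to stimulus onset (s)
N100_WIN = (0.08, 0.15)     # assumed N100 latency window (s)
P300_WIN = (0.30, 0.60)     # assumed P300 latency window (s)


def mean_amplitude(epochs, window, sfreq=SFREQ, tmin=TMIN):
    """Average voltage in a latency window.

    epochs: array of shape (n_trials, n_samples), baseline-corrected ERPs
    from a single electrode (e.g., a midline site).
    """
    start = int((window[0] - tmin) * sfreq)
    stop = int((window[1] - tmin) * sfreq)
    return epochs[:, start:stop].mean(axis=1).mean()


def oddball_effect(deviant_epochs, standard_epochs, window):
    """Deviant-minus-standard mean amplitude (the oddball response)."""
    return (mean_amplitude(deviant_epochs, window)
            - mean_amplitude(standard_epochs, window))


# Simulated data for the four conditions:
# {standard /ba/, deviant /a/} x {speaking face, pixelated mouth}.
rng = np.random.default_rng(0)
n_samples = int((0.8 - TMIN) * SFREQ)
sim = {cond: rng.normal(0.0, 1.0, size=(100, n_samples))
       for cond in ["std_face", "dev_face", "std_pix", "dev_pix"]}

# Phonemic restoration predicts a smaller P300 oddball response with the
# articulating face than with the pixelated mouth.
p300_face = oddball_effect(sim["dev_face"], sim["std_face"], P300_WIN)
p300_pix = oddball_effect(sim["dev_pix"], sim["std_pix"], P300_WIN)
print(f"P300 oddball (speaking face):    {p300_face:.3f} uV")
print(f"P300 oddball (pixelated mouth):  {p300_pix:.3f} uV")
print(f"Restoration-related attenuation: {p300_pix - p300_face:.3f} uV")
```

With real data, the same windowed-mean comparison could be repeated for the N100 window, and individual differences (e.g., Social Responsiveness Scale scores) related to overall P300 amplitude across face contexts.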
Keywords: ERP; audiovisual speech perception; broader autism phenotype; development.
Conflict of interest statement
The authors declare no conflict of interest.