Brain Sci. 2023 Jun 30;13(7):1011. doi: 10.3390/brainsci13071011.

Event-Related Potentials in Assessing Visual Speech Cues in the Broader Autism Phenotype: Evidence from a Phonemic Restoration Paradigm

Vanessa Harwood et al. Brain Sci. 2023.

Abstract

Audiovisual speech perception involves the simultaneous processing of auditory and visual speech. Deficits in audiovisual speech perception are reported in autistic individuals; however, less is known about audiovisual speech perception within the broader autism phenotype (BAP), which includes individuals with elevated, yet subclinical, levels of autistic traits. We investigate the neural indices of audiovisual speech perception in adults exhibiting a range of autism-like traits using event-related potentials (ERPs) in a phonemic restoration paradigm. In this paradigm, we consider conditions where the speech articulators (mouth and jaw) are visible (AV condition) or obscured by a pixelated mask (PX condition). These two face conditions were included in both a passive experiment (simply viewing a speaking face) and an active experiment (pressing a button in response to a specific consonant-vowel stimulus). The results revealed an N100 ERP component that was present for all listening contexts and conditions; however, it was attenuated in the active AV condition, where participants could view the speaker's face, including the mouth and jaw. The P300 ERP component was present in the active experiment only and was significantly greater in the AV condition than in the PX condition, suggesting increased neural effort in detecting deviant stimuli when visible articulation was present, as well as a visual influence on perception. Finally, the P300 response was negatively correlated with autism-like traits, indicating that higher levels of autistic traits were associated with generally smaller P300 responses in the active AV and PX conditions. These findings support the view that atypical audiovisual processing may be characteristic of the BAP in adults.

Keywords: audiovisual; event-related potentials; phonemic restoration; speech perception.
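
As a point of reference for the figure captions below, the N100 and P300 components were quantified by averaging epoched EEG over a component-specific time window (75–125 ms and 350–550 ms, respectively) and electrode cluster. The following minimal sketch illustrates that kind of computation only; it is not the authors' pipeline, and the array layout, sampling rate, and channel indices are hypothetical assumptions for illustration.

```python
import numpy as np

def mean_component_amplitude(epochs, times, window, cluster_idx):
    """Average epoched EEG over a time window and an electrode cluster.

    epochs      : array, shape (n_trials, n_channels, n_samples), in microvolts
    times       : array, shape (n_samples,), epoch time axis in seconds
    window      : (start_s, end_s), e.g. (0.075, 0.125) for the N100
    cluster_idx : list of channel indices forming the cluster
    """
    t_mask = (times >= window[0]) & (times <= window[1])
    # Select cluster channels and window samples, then average over
    # trials, channels, and time to obtain a single mean amplitude.
    return epochs[:, cluster_idx, :][:, :, t_mask].mean()

# Hypothetical usage mirroring the per-participant condition means in Figures 5 and 6.
rng = np.random.default_rng(0)
times = np.linspace(-0.1, 0.8, 451)             # assumed 500 Hz sampling rate
epochs = rng.normal(size=(60, 32, times.size))  # placeholder data: 60 trials x 32 channels
frontocentral = [3, 4, 5, 8, 9]                 # hypothetical electrode indices
n100 = mean_component_amplitude(epochs, times, (0.075, 0.125), frontocentral)
print(f"Mean N100 amplitude: {n100:.2f} microvolts")
```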

Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1. Spectrograms of (a) /ba/ and (b) /a/.
Figure 2. Example of the audiovisual (AV) face condition (a), where articulators are present and visible, and the pixelated (PX) face condition (b), where pixels cover the articulators. Written informed consent was obtained for the publication of the identifiable image.
Figure 3. Electrode montage. Electrodes contributing to the N100 component cluster are highlighted with blue circles. Electrodes contributing to the P300 component cluster are highlighted with red circles.
Figure 4. ERP scalp maps for all conditions.
Figure 5. N100 ERPs in the frontocentral electrode cluster: (a) grand mean waveforms and (b) averaged amplitudes. (a) Grand mean waveforms for all conditions in the frontocentral electrode cluster; the time window over which the N100 component was averaged (75–125 ms) is highlighted in gray. (b) N100 amplitudes for all participants and conditions, averaged over the frontocentral electrode cluster and the highlighted time window. Each point represents one participant’s mean for one condition. Gray horizontal lines represent sample means; gray vertical lines represent 95% confidence intervals.
Figure 6. P300 ERPs in the medioparietal electrode cluster: (a) grand mean waveforms and (b) averaged amplitudes. (a) Grand mean waveforms for all conditions in the medioparietal electrode cluster; the time window over which the P300 component was averaged (350–550 ms) is highlighted in gray. (b) P300 amplitudes for all participants and conditions, averaged over the medioparietal electrode cluster and the highlighted time window. Each point represents one participant’s mean for one condition. Gray horizontal lines represent sample means; gray vertical lines represent 95% confidence intervals.
Figure 7. Associations between P300 amplitudes and SRS-2 scores. Each point represents one participant’s average component amplitude in one condition. Error ribbon = 95% CI. Correlations shown for visualization purposes only; significance testing was performed using mixed-effects models.
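
The Figure 7 caption notes that significance testing used mixed-effects models. As a rough illustration of how P300 amplitude, SRS-2 score, and condition could be related in such a model (not the authors' exact specification), the sketch below fits a random-intercept model with statsmodels on placeholder data; the column names, formula, and random-effects structure are assumptions for illustration only.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_participants = 40

# Placeholder long-format data: one row per participant per active condition.
df = pd.DataFrame({
    "participant": np.repeat(np.arange(n_participants), 2),
    "condition": ["AV", "PX"] * n_participants,                     # active AV vs. PX
    "srs2": np.repeat(rng.normal(55, 10, n_participants), 2),       # autism-like traits
})
# Simulated amplitudes with a negative SRS-2 slope, as reported in the abstract.
df["p300"] = 3.0 - 0.03 * df["srs2"] + rng.normal(0, 0.5, len(df))

# Random intercept per participant; fixed effects of SRS-2, condition, and their interaction.
model = smf.mixedlm("p300 ~ srs2 * condition", data=df, groups=df["participant"])
result = model.fit()
print(result.summary())
```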
