Multisensory Integration of Naturalistic Speech and Gestures in Autistic Adults
- PMID: 40247672
- PMCID: PMC12166514
- DOI: 10.1002/aur.70042
Abstract
Seeing the speaker often facilitates auditory speech comprehension through audio-visual integration. This audio-visual facilitation is stronger under challenging listening conditions, such as in real-life social environments. Autism has been associated with atypicalities in integrating audio-visual information, potentially underlying social difficulties in this population. The present study investigated multisensory integration (MSI) of audio-visual speech information among autistic and neurotypical adults. Participants performed a speech-in-noise task in a realistic multispeaker social scenario with audio-visual, auditory, or visual trials while their brain activity was recorded using EEG. The neurotypical group demonstrated a non-linear audio-visual effect in alpha oscillations, whereas the autistic group showed merely additive processing. Despite these differences in neural correlates, both groups achieved similar behavioral audio-visual facilitation outcomes. These findings suggest that although autistic and neurotypical brains might process multisensory cues differently, they achieve comparable benefits from audio-visual speech. These results contribute to the growing body of literature on MSI atypicalities in autism.
Keywords: EEG; audio‐visual speech; autism; iconic gestures; multisensory integration.
© 2025 The Author(s). Autism Research published by International Society for Autism Research and Wiley Periodicals LLC.
Conflict of interest statement
The authors declare no conflicts of interest.