The integration processing of the visual and auditory information in videos of real-world events: an ERP study
- PMID: 19520145
- DOI: 10.1016/j.neulet.2009.05.082
Abstract
In real life, the human brain usually receives information through both visual and auditory channels and processes this multisensory information, but studies on the integrated processing of dynamic visual and auditory information are relatively few. In this paper, we designed an experiment that presented common-scenario, real-world videos with matched and mismatched actions (images) and sounds as stimuli, aiming to study how the human brain integrates synchronized visual and auditory information in videos of real-world events, using event-related potential (ERP) methods. Experimental results showed that videos with mismatched actions (images) and sounds elicited a larger P400 than videos with matched actions (images) and sounds. We believe the P400 waveform might be related to the cognitive integration of mismatched multisensory information in the human brain. The results also indicated that synchronized multisensory information streams can interfere with each other, which influences the outcome of cognitive integration.
Similar articles
- Cognitive integration of asynchronous natural or non-natural auditory and visual information in videos of real-world events: an event-related potential study. Neuroscience. 2011 Apr 28;180:181-90. doi: 10.1016/j.neuroscience.2011.01.066. PMID: 21310215
- The influence of matching degrees of synchronous auditory and visual information in videos of real-world events on cognitive integration: an event-related potential study. Neuroscience. 2011 Oct 27;194:19-26. doi: 10.1016/j.neuroscience.2011.08.009. PMID: 21855611
- Semantic integration of differently asynchronous audio-visual information in videos of real-world events in cognitive processing: an ERP study. Neurosci Lett. 2011 Jul 1;498(1):84-8. doi: 10.1016/j.neulet.2011.04.068. PMID: 21565250
- Semantics and the multisensory brain: how meaning modulates processes of audio-visual integration. Brain Res. 2008 Nov 25;1242:136-50. doi: 10.1016/j.brainres.2008.03.071. PMID: 18479672. Review.
- Benefits of multisensory learning. Trends Cogn Sci. 2008 Nov;12(11):411-7. doi: 10.1016/j.tics.2008.07.006. PMID: 18805039. Review.
Cited by
- Neural initialization of audiovisual integration in prereaders at varying risk for developmental dyslexia. Hum Brain Mapp. 2017 Feb;38(2):1038-1055. doi: 10.1002/hbm.23437. PMID: 27739608
- Your Brain on Comics: A Cognitive Model of Visual Narrative Comprehension. Top Cogn Sci. 2020 Jan;12(1):352-386. doi: 10.1111/tops.12421. PMID: 30963724
- Simulating reading acquisition: The link between reading outcome and multimodal brain signatures of letter-speech sound learning in prereaders. Sci Rep. 2018 May 8;8(1):7121. doi: 10.1038/s41598-018-24909-8. PMID: 29740067
- Automatic Processing of Emotional Words in the Absence of Awareness: The Critical Role of P2. Front Psychol. 2017 Apr 20;8:592. doi: 10.3389/fpsyg.2017.00592. PMID: 28473785
- When a hit sounds like a kiss: An electrophysiological exploration of semantic processing in visual narrative. Brain Lang. 2017 Jun;169:28-38. doi: 10.1016/j.bandl.2017.02.001. PMID: 28242517