The influence of matching degrees of synchronous auditory and visual information in videos of real-world events on cognitive integration: an event-related potential study
- PMID: 21855611
- DOI: 10.1016/j.neuroscience.2011.08.009
Abstract
In this study, we examined how the degree of matching between synchronous natural auditory and visual information influences cognitive integration. Videos with matched, moderately matched, and mismatched audio-visual information served as stimuli. Compared with the matched condition, videos with moderately matched audio-visual information elicited N400, P600, and late negativity (LN) effects, while videos with mismatched audio-visual information elicited N400 and late negativity effects. The results further suggest that the N400 may reflect the connection process during multisensory integration, whereas the P600 is more closely related to evaluating the matching degree of the audio-visual information in the videos. Late negativity under the mismatched condition may combine a late frontal negativity (LFN) and a late posterior negativity (LPN), reflecting attention reallocation and recognition processes, respectively, whereas late negativity under the moderately matched condition may be the LPN alone, related to the recognition process. Overall, cognitive integration of synchronous audio-visual information is modulated by the matching degree of the audio-visual information, as indexed by distinct event-related potential (ERP) effects.
Copyright © 2011 IBRO. Published by Elsevier Ltd. All rights reserved.
Similar articles
- Cognitive integration of asynchronous natural or non-natural auditory and visual information in videos of real-world events: an event-related potential study. Neuroscience. 2011 Apr 28;180:181-90. doi: 10.1016/j.neuroscience.2011.01.066. PMID: 21310215
- Semantic integration of differently asynchronous audio-visual information in videos of real-world events in cognitive processing: an ERP study. Neurosci Lett. 2011 Jul 1;498(1):84-8. doi: 10.1016/j.neulet.2011.04.068. PMID: 21565250
- Semantic association of ecologically unrelated synchronous audio-visual information in cognitive integration: an event-related potential study. Neuroscience. 2011 Sep 29;192:494-9. doi: 10.1016/j.neuroscience.2011.05.072. PMID: 21722711
- Semantics and the multisensory brain: how meaning modulates processes of audio-visual integration. Brain Res. 2008 Nov 25;1242:136-50. doi: 10.1016/j.brainres.2008.03.071. PMID: 18479672. Review.
- [Ventriloquism and audio-visual integration of voice and face]. Brain Nerve. 2012 Jul;64(7):771-7. PMID: 22764349. Review. Japanese.