Differences in the neural basis of automatic auditory and visual time perception: ERP evidence from an across-modal delayed response oddball task
- PMID: 20170647
- DOI: 10.1016/j.brainres.2010.02.040
Abstract
In everyday life, we must process auditory and visual temporal information as efficiently as possible. Although automatic auditory time perception has been widely investigated using the mismatch negativity (MMN) as an index, the neural basis of automatic visual time perception has been largely ignored. The present study investigated the automatic processing of auditory and visual time perception using a cross-modal delayed response oddball paradigm. In the experimental condition, the standard stimulus lasted 200 ms and the deviant stimulus 120 ms; these durations were exchanged in the control condition. Reaction time, accuracy, and event-related potential (ERP) data were measured while participants performed a duration discrimination task. The ERP results showed that the MMN, N2b, and P3 were elicited by the auditory deviant stimulus under the attention condition, whereas only the MMN was elicited under the inattention condition. The MMN was largest over the frontal and central sites, and MMN amplitude did not differ significantly between the attention and inattention conditions. In contrast, the change-related positivity (CRP) and the visual mismatch negativity (vMMN) were elicited by the visual deviant stimulus under both the attention and inattention conditions. The CRP was largest over the occipito-temporal sites under the attention condition and over the fronto-central sites under the inattention condition, and CRP amplitude differed significantly between the two conditions. The vMMN was largest over the parieto-occipital sites under the attention condition and over the fronto-central sites under the inattention condition, and vMMN amplitude likewise differed significantly between the two conditions. Thus the auditory MMN does not appear to be modulated by attention, whereas the visual CRP and vMMN are.
Therefore, the present study provides electrophysiological evidence for the existence of automatic visual time perception and supports an "attentional switch" hypothesis for a modality effect on duration judgments, such that auditory temporal information is processed relatively automatically, whereas visual temporal information processing requires controlled attention.
Copyright 2010 Elsevier B.V. All rights reserved.
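The mismatch responses reported above (MMN, vMMN) are conventionally extracted as difference waves: the averaged ERP to the standard stimulus is subtracted from the averaged ERP to the deviant, and the peak of the resulting negativity is measured in a post-stimulus window. The sketch below illustrates that generic procedure on simulated data; it is not the authors' analysis pipeline, and all signal parameters (sampling rate, component latencies, amplitudes) are illustrative assumptions, not values from the study.

```python
import numpy as np

# Illustrative sketch of difference-wave extraction (assumed parameters,
# simulated ERPs -- not data or methods from the study).

fs = 500                           # sampling rate in Hz (assumed)
t = np.arange(-0.1, 0.5, 1 / fs)   # epoch from -100 ms to 500 ms

def bump(t, center_s, width_s, amp_uv):
    """Toy ERP component modeled as a Gaussian bump (microvolts)."""
    return amp_uv * np.exp(-((t - center_s) ** 2) / (2 * width_s ** 2))

# Simulated averaged ERPs at one fronto-central electrode:
# the deviant carries an extra negativity around 180 ms post-stimulus.
standard = bump(t, 0.10, 0.02, 2.0)
deviant = standard + bump(t, 0.18, 0.03, -3.0)

diff_wave = deviant - standard     # deviant-minus-standard difference wave

# Locate the negative peak in a typical 100-250 ms search window.
window = (t >= 0.10) & (t <= 0.25)
peak_idx = np.argmin(diff_wave[window])
peak_latency_ms = t[window][peak_idx] * 1000
peak_amp_uv = diff_wave[window][peak_idx]

print(f"mismatch peak: {peak_amp_uv:.2f} uV at {peak_latency_ms:.0f} ms")
```

In real recordings the same subtraction is applied per electrode, which is how scalp distributions like "largest over fronto-central sites" are characterized.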