Integrating audiovisual information for the control of overt attention
- PMID: 17997680
- DOI: 10.1167/7.10.11
Abstract
In everyday life, our brains must judge the relevance of huge amounts of sensory input, and this input is further distributed over different modalities. This raises the question of how different sources of information interact to control overt attention during free exploration of the environment under natural conditions: the modalities may act independently or interact to determine the resulting overt behavior. To address this question, we presented natural images and lateralized natural sounds in a variety of conditions and measured the eye movements of human subjects. We show that, in multimodal conditions, fixation probabilities increase on the side of the image from which the sound originates, demonstrating that, at a coarse scale, lateralized auditory stimulation topographically increases the salience of the visual field. This shift of attention is nevertheless specific: the probability of fixating a given location on the side of the sound scales with the saliency of the visual stimulus, so the selection of fixation points in multimodal conditions depends on both auditory and visual saliencies. Further analysis shows that a linear combination of the two unimodal saliencies provides a good model of this integration process, which is optimal according to information-theoretical criteria. Our results support a functional joint saliency map that integrates the unimodal saliencies before any decision is taken about the subsequent fixation point. These results provide guidelines for the performance and architecture of any model of overt attention that deals with more than one modality.
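The abstract's central modeling claim is that fixation selection in multimodal conditions is well described by a linear combination of the unimodal saliency maps. A minimal sketch of that idea follows; the toy maps, the weights `w_v`/`w_a`, and the left-lateralized auditory gradient are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def joint_saliency(visual, auditory, w_v=0.7, w_a=0.3):
    """Linearly combine two unimodal saliency maps into one joint map.

    Each map is normalized to sum to 1 so that the result is a
    probability-like map over candidate fixation locations.
    """
    v = visual / visual.sum()
    a = auditory / auditory.sum()
    return w_v * v + w_a * a

# Toy visual saliency map and a left-lateralized auditory map
# (sound on the left, so auditory salience decays left to right).
rng = np.random.default_rng(0)
visual = rng.random((48, 64))
auditory = np.tile(np.linspace(1.0, 0.0, 64), (48, 1))

joint = joint_saliency(visual, auditory)

# A simple readout: the next fixation lands at the joint-map maximum.
y, x = np.unravel_index(np.argmax(joint), joint.shape)
```

Because both inputs are normalized before weighting, the joint map integrates to `w_v + w_a`; with the gradient above, total saliency on the sound side exceeds the opposite side, while within the sound side the joint value still scales with visual saliency, mirroring the two effects reported in the abstract.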
Similar articles
- Interactions between voluntary and stimulus-driven spatial attention mechanisms across sensory modalities. J Cogn Neurosci. 2009 Dec;21(12):2384-97. doi: 10.1162/jocn.2008.21178. PMID: 19199406
- ERP evidence for cross-modal audiovisual effects of endogenous spatial attention within hemifields. J Cogn Neurosci. 2004 Mar;16(2):272-88. doi: 10.1162/089892904322984562. PMID: 15068597. Clinical Trial.
- Differences of monkey and human overt attention under natural conditions. Vision Res. 2006 Apr;46(8-9):1194-209. doi: 10.1016/j.visres.2005.08.032. Epub 2005 Dec 20. PMID: 16375943
- Attention coordination and anticipatory control. Int Rev Neurobiol. 1997;41:575-98. doi: 10.1016/s0074-7742(08)60371-2. PMID: 9378609. Review.
- On the identification of repeatedly presented, brief visual stimuli. Psychol Bull. 1972 Aug;78(2):142-54. doi: 10.1037/h0032941. PMID: 4558986. Review. No abstract available.
Cited by
- Machine learning accurately classifies age of toddlers based on eye tracking. Sci Rep. 2019 Apr 18;9(1):6255. doi: 10.1038/s41598-019-42764-z. PMID: 31000762. Free PMC article.
- The contributions of image content and behavioral relevancy to overt attention. PLoS One. 2014 Apr 15;9(4):e93254. doi: 10.1371/journal.pone.0093254. eCollection 2014. PMID: 24736751. Free PMC article.
- An extensive dataset of eye movements during viewing of complex images. Sci Data. 2017 Jan 31;4:160126. doi: 10.1038/sdata.2016.126. PMID: 28140391. Free PMC article.
- Spatial orienting in complex audiovisual environments. Hum Brain Mapp. 2014 Apr;35(4):1597-614. doi: 10.1002/hbm.22276. Epub 2013 Apr 24. PMID: 23616340. Free PMC article.
- A spatially collocated sound thrusts a flash into awareness. Front Integr Neurosci. 2015 Feb 27;9:16. doi: 10.3389/fnint.2015.00016. eCollection 2015. PMID: 25774126. Free PMC article.