Adv Cogn Psychol. 2017 Mar 31;13(1):83-96. doi: 10.5709/acp-0209-2. eCollection 2017.

Is Attentional Resource Allocation Across Sensory Modalities Task-Dependent?


Basil Wahn et al. Adv Cogn Psychol. 2017.

Abstract

Human information processing is limited by attentional resources. That is, via attentional mechanisms, humans select a limited amount of sensory input to process while other sensory input is neglected. In multisensory research, a matter of ongoing debate is whether there are distinct pools of attentional resources for each sensory modality or whether attentional resources are shared across sensory modalities. Recent studies have suggested that attentional resource allocation across sensory modalities is in part task-dependent. That is, the recruitment of attentional resources across the sensory modalities depends on whether processing involves object-based attention (e.g., the discrimination of stimulus attributes) or spatial attention (e.g., the localization of stimuli). In the present paper, we review findings in multisensory research related to this view. For the visual and auditory sensory modalities, findings suggest that distinct resources are recruited when humans perform object-based attention tasks, whereas for the visual and tactile sensory modalities, partially shared resources are recruited. If object-based attention tasks are time-critical, shared resources are recruited across the sensory modalities. When humans perform an object-based attention task in combination with a spatial attention task, partly shared resources are likewise recruited across the sensory modalities. By contrast, spatial attention tasks consistently involve attentional resources that are shared across the sensory modalities. Overall, these findings suggest that the attentional system flexibly allocates attentional resources depending on task demands. We propose that such flexibility reflects a large-scale optimization strategy that minimizes the brain's costly resource expenditures while maximizing its capability to process currently relevant information.

Keywords: attentional blink; attentional resources; load theory; multiple object tracking; multisensory.


Figures

Figure 1. Overview of the main conclusions of the review. Green, blue, and red lines indicate distinct, partially shared, and shared attentional resources, respectively.


