Review

The multifaceted interplay between attention and multisensory integration

Durk Talsma et al. Trends Cogn Sci. 2010 Sep;14(9):400-10. doi: 10.1016/j.tics.2010.06.008. Epub 2010 Aug 2.

Abstract

Multisensory integration has often been characterized as an automatic process. Recent findings indicate that multisensory integration can occur across various stages of stimulus processing that are linked to, and can be modulated by, attention. Stimulus-driven, bottom-up mechanisms induced by crossmodal interactions can automatically capture attention towards multisensory events, particularly when competition to focus elsewhere is relatively low. Conversely, top-down attention can facilitate the integration of multisensory inputs and lead to a spread of attention across sensory modalities. These findings point to a more intimate and multifaceted interplay between attention and multisensory integration than was previously thought. We review developments in the current understanding of the interactions between attention and multisensory processing, and propose a framework that unifies previous, apparently discordant, findings.


Figures

Figure 1. A framework for the interactions between multisensory integration and attention
(a) Sequence of processing steps. Inputs from the sense organs are thought to interact cross-modally at multiple phases of the processing pathways, including at very early stages [8]. Stimuli can be integrated automatically if a number of conditions are satisfied: (1) if the initial saliency of one of the stimuli is at or above a critical threshold, preprocessing stages will attempt to spatio-temporally realign this stimulus with one of lesser salience [(2); spatio-temporal realignment]. The stimulus stream will then be monitored for congruence in the stimulus patterns of the matched streams [(3); congruency detection]. If the realignment and/or congruency-matching processes succeed, the neural responsiveness of the brain areas processing the input streams will be increased [(4); recurrent stimulus-driven sensitivity adjustments] to sustain the integration process. If the stimuli cannot be realigned, or if incongruency is detected, sensory gain will tend to be decreased. Note that we consider these potential gain adjustments to be mainly stimulus driven, and therefore a reflection of the bottom-up-driven side of the interaction between multisensory integration and attention. If none of the stimuli is of sufficient saliency, top-down attention may be necessary to set up an initial selection of to-be-integrated stimuli [(5); top-down sensory gain adjustments]. The resulting boost in sensory sensitivity due to a top-down gain manipulation might then be sufficient to initiate the processes of spatio-temporal alignment and congruency matching that would otherwise not have occurred. In addition, top-down attention can modulate processing at essentially all of these stages of multisensory processing and integration. (b) Three examples of interactive influences between multisensory integration and attention: (1) bottom-up multisensory integration, which can then drive a shift of attention; (2) the need for top-down attention for multisensory integration in the presence of many competing stimulus representations; (3) the spreading of attention across space and modality (visual to auditory).
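To make the conditional structure of this sequence concrete, the sketch below renders the numbered steps as a minimal Python function. This is an illustrative assumption on our part, not the authors' computational model; the Stream class, the saliency threshold, the lag criterion and the gain factors are hypothetical placeholders.

    from dataclasses import dataclass

    @dataclass
    class Stream:
        saliency: float      # initial bottom-up saliency of this sensory input
        onset_ms: float      # stimulus onset time
        pattern: str         # stand-in for the temporal stimulus pattern
        gain: float = 1.0    # current sensory gain

    def integrate(a: Stream, b: Stream,
                  threshold: float = 0.5, max_lag_ms: float = 100.0) -> bool:
        # (1) Saliency check; (5) if neither stream reaches the critical threshold,
        #     top-down attention must first boost sensory gain (placeholder factor).
        if max(a.saliency, b.saliency) < threshold:
            a.gain *= 1.5
            b.gain *= 1.5

        # (2) Spatio-temporal realignment: can the two onsets be brought into register?
        realigned = abs(a.onset_ms - b.onset_ms) <= max_lag_ms

        # (3) Congruency detection on the matched streams.
        congruent = realigned and (a.pattern == b.pattern)

        # (4) Recurrent stimulus-driven sensitivity adjustment: boost gain on success,
        #     reduce it when realignment or congruency matching fails.
        factor = 1.2 if congruent else 0.8
        a.gain *= factor
        b.gain *= factor
        return congruent

    print(integrate(Stream(0.9, 0.0, "AV-sync"), Stream(0.3, 40.0, "AV-sync")))  # True: integrated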
Figure 2. Multisensory integration mechanisms affecting visual attention in a bottom-up driven fashion
(a) Cluttered displays containing a variable number of short line segments were presented. Display elements continuously changed color from green to red (or vice versa) at random moments and locations. A short tone pip could be presented simultaneously with the color change of the target element. Participants were required to detect and report the orientation of the target element, which consisted of a horizontal or vertical line among ±45° tilted line distractors. (b) In the absence of a sound, search times increased linearly with the number of distractor items in the display (white squares). In contrast, when the sound was present, search times became much shorter and independent of set size, indicating that the target stimulus popped out of the background (black squares). (c) Search times as a function of the relative stimulus onset asynchrony (SOA) between the color change of the target and the onset of the sound. Negative SOAs indicate that the tone preceded the visual target event, and positive SOAs indicate that the target event preceded the tone. (Data from the condition with set size fixed at 48 elements.) Adapted, with permission, from ref [24].
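The search-slope logic behind panel (b) can be illustrated with a minimal sketch; the reaction times below are invented for illustration and are not the data of ref [24].

    import numpy as np

    # Hypothetical reaction times (ms): without the tone, RT grows with set size
    # (serial search); with the tone, RT stays roughly flat (pop-out).
    set_sizes  = np.array([24, 36, 48])
    rt_no_tone = np.array([1450.0, 1800.0, 2150.0])
    rt_tone    = np.array([1050.0, 1070.0, 1060.0])

    slope_no_tone = np.polyfit(set_sizes, rt_no_tone, 1)[0]   # ms per added distractor
    slope_tone    = np.polyfit(set_sizes, rt_tone, 1)[0]

    print(f"search slope without tone: {slope_no_tone:.1f} ms/item")  # steep: effortful search
    print(f"search slope with tone:    {slope_tone:.1f} ms/item")     # near zero: pop-out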
Figure 3. Effects of top-down spatial attention on audiovisual speech perception
(a) Layout of the experiment. Sounds were presented from one central location, while two laterally presented visual streams of lip movements were played. One of these streams was congruent with the auditory speech signals, while the other stream was incongruent. (b) Attention was selectively oriented to either the left or right visual stream, either of which could, in turn, be congruent or incongruent with the auditory speech stimuli. (c) Attending to the congruent stimuli resulted in increased activation in several brain areas that are typically associated with multisensory integration. These areas included the superior temporal sulcus and the superior colliculus (central panel), as well as large parts of the retinotopically organized visual areas V1 and V2. Adapted, with permission, from ref [39].
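As a minimal sketch of the 2 x 2 design implied by panels (a) and (b), with hypothetical condition labels of our own (this is not the analysis of ref [39]), the contrast of interest weights attended-congruent against attended-incongruent streams, collapsed over side:

    # Hypothetical condition labels: attended side (left/right) x audiovisual congruency.
    conditions = [(side, cong)
                  for side in ("left", "right")
                  for cong in ("congruent", "incongruent")]

    # +1 when the attended stream is congruent with the auditory speech, -1 when incongruent.
    contrast = {cond: (+1 if cond[1] == "congruent" else -1) for cond in conditions}
    print(contrast)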
Figure 4. Spreading of attention across a multisensory object
(a) Experimental design. Visual stimuli were flashed successively, and in random order, in the left and right hemifields. On half of the trials, a task-irrelevant central tone was presented synchronously with the lateral visual stimulus. Participants were instructed to visually attend selectively to only one of the two locations. (b) ERP subtraction procedure and key results. ERPs elicited by attended visual stimuli that occurred alone were subtracted from ERPs elicited by attended visual stimuli that were accompanied by a tone, yielding the extracted ERP response to the central tones when they occurred in the context of an attended (lateral) visual stimulus. A similar subtraction was applied to unattended-visual trials to yield the extracted ERP response to the central tones when they occurred in the context of an unattended (lateral) visual stimulus. An overlay of these two extracted ERPs showed that tones presented in the context of an attended visual stimulus elicited a prolonged negativity over fronto-central scalp areas beginning at around 200 ms poststimulus. This effect resembles the late auditory processing negativity, a hallmark neural effect elicited during intramodal auditory attention. (c) fMRI results from the same paradigm. These results, extracted using an analogous contrast logic, showed that auditory stimuli presented in the context of an attended visual stimulus yielded an increase in brain activity in the auditory cortex, compared with the activation elicited by the same tone when it was presented in the context of an unattended visual stimulus. The enhanced processing of task-irrelevant auditory stimuli that occur simultaneously with an attended visual stimulus, even one occurring in a different location, suggests that visual attention has spread across the components of the multisensory object to encompass its auditory part. Adapted, with permission, from ref [37]. Copyright © 2005, The National Academy of Sciences.
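The subtraction logic of panel (b) can be sketched as follows, using invented waveforms rather than the data of ref [37]; the tone_effect parameter is a hypothetical stand-in for the tone-related negativity.

    import numpy as np

    rng = np.random.default_rng(0)
    t = np.arange(-100, 500)                      # time relative to stimulus onset (ms)

    def mean_erp(n_trials: int, tone_effect: float = 0.0) -> np.ndarray:
        # Made-up single-trial waveforms: noise plus an optional sustained negativity
        # after ~200 ms standing in for the tone-related response.
        trials = rng.normal(0.0, 1.0, size=(n_trials, t.size))
        trials[:, t > 200] -= tone_effect
        return trials.mean(axis=0)

    # Extracted tone response = (visual + tone) minus (visual alone), computed separately
    # for attended and unattended visual contexts, as in Figure 4b.
    tone_attended   = mean_erp(100, tone_effect=1.0) - mean_erp(100)
    tone_unattended = mean_erp(100, tone_effect=0.2) - mean_erp(100)

    # Overlaying the two extracted responses: a larger sustained negativity after ~200 ms
    # in the attended context is the signature of attention spreading to the tone.
    late = t > 200
    print(tone_attended[late].mean(), tone_unattended[late].mean())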

References

    1. Urbantschitsch V. Über den Einfluss einer Sinneserregung auf die übrigen Sinnesempfindungen. Arch gesch Psych. 1880;42:155–175.
    2. Spence C, Driver J. Cross-Modal Space and Cross-Modal Attention. Oxford University Press; 2004.
    3. Stein B, Meredith MA. The Merging of the Senses. MIT Press; 1993.
    4. Wallace MT, et al. Multisensory integration in the superior colliculus of the alert cat. J Neurophysiol. 1998;80:1006–1010.
    5. Meredith MA. On the neuronal basis for multisensory convergence: a brief overview. Cogn Brain Res. 2002;14:31–40.
