Review

Curr Opin Neurobiol. 2010 Jun;20(3):353-60. doi: 10.1016/j.conb.2010.04.009. Epub 2010 May 12.

Multisensory integration: resolving sensory ambiguities to build novel representations

Andrea M Green et al. Curr Opin Neurobiol. 2010 Jun.

Abstract

Multisensory integration plays several important roles in the nervous system. One is to combine information from multiple complementary cues to improve stimulus detection and discrimination. Another is to resolve peripheral sensory ambiguities and create novel internal representations that do not exist at the level of individual sensors. Here we focus on how ambiguities inherent in vestibular, proprioceptive and visual signals are resolved to create behaviorally useful internal estimates of our self-motion. We review recent studies that have shed new light on the nature of these estimates and how multiple, but individually ambiguous, sensory signals are processed and combined to compute them. We emphasize the need to combine experiments with theoretical insights to understand the transformations that are being performed.

Figures

Fig. 1
Evidence for a neural resolution to the tilt/translation ambiguity. Responses of (a) an otolith afferent, (b) a Purkinje cell in the nodulus/uvula region of the caudal vermis and (c) a neuron in the rostral fastigial nucleus during translation, tilt and combinations of these stimuli presented either in phase to double the net acceleration (“Tilt+Translation”) or out of phase to cancel it (“Tilt-Translation”). Unlike otolith afferents (a), which provide ambiguous motion information because their responses always reflect net acceleration, nodulus/uvula Purkinje cells (b) selectively encode translation [19**]. Deep cerebellar and vestibular nuclei cells (c) show intermediate responses, thus reflecting a partial solution to the ambiguity [10,20,21]. Neural data are replotted with permission from Angelaki et al. [10,50], Yakusheva et al. [19**] and Green et al. [20].
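
To make the afferent-level ambiguity concrete, here is a minimal numerical sketch (Python; the tilt angle, axis choice and amplitudes are assumptions for illustration, not values from the paper) showing that a static tilt and a suitably scaled translation produce identical otolith drive:

```python
import numpy as np

# Illustrative sketch: otolith afferents encode the net gravito-inertial
# acceleration, so a static tilt and a matched translation are
# indistinguishable at the periphery. Amplitudes below are arbitrary.
G = 9.81                          # gravitational acceleration (m/s^2)
tilt = np.deg2rad(5.0)            # assumed 5-degree static roll tilt

# Interaural acceleration component sensed during the pure tilt:
f_tilt = G * np.sin(tilt)

# A pure interaural translation chosen to produce the same net signal:
f_translation = G * np.sin(tilt)  # same afferent drive, different self-motion

print(np.isclose(f_tilt, f_translation))  # True: the periphery cannot tell
```
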
Fig. 2
Schematic representation of the theoretical computations to solve the tilt/translation ambiguity. Head-centered angular velocity, ω (e.g., from the canals), combines nonlinearly (multiplicatively) with a current estimate of gravitational acceleration (i.e., tilt) to compute the rate of change of the gravity vector relative to the head (dg/dt = -ω×g, where “×” represents a vector cross product). For small-amplitude rotations, dg/dt represents the earth-horizontal component of rotation, ω_EH (green). Integrating (∫) dg/dt and taking into account initial head orientation (e.g., from static otolith signals; dotted blue) yields an updated estimate of gravitational acceleration, g (orange; g = -∫ω×g dt). This g estimate can be combined with the net acceleration signal, a (blue; from the otoliths), to calculate translational acceleration, t (red). This schematic is based on solving the equation t = a - ∫ω×g dt (e.g., see [22,23]).
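
A minimal sketch of this computation, using forward-Euler integration and following the caption's sign convention (t = a - ∫ω×g dt, i.e., t = a + g with g = -∫ω×g dt); the variable names and the test stimulus are illustrative assumptions:

```python
import numpy as np

def estimate_translation(omega, a, g0, dt):
    """Sketch of the Fig. 2 computation: integrate dg/dt = -omega x g from an
    initial gravity estimate g0 (e.g., derived from static otolith signals),
    then recover translational acceleration as t = a - INT(omega x g)dt,
    i.e., t = a + g in this sign convention. omega and a are (T, 3) arrays of
    head-centered angular velocity (rad/s) and net otolith signal (m/s^2)."""
    g = np.empty_like(a)
    g[0] = g0
    for k in range(len(g) - 1):
        # Forward-Euler step of dg/dt = -omega x g (gravity turning in head frame)
        g[k + 1] = g[k] - np.cross(omega[k], g[k]) * dt
    return a + g  # t = a + g = a - INT(omega x g)dt

# Check on a pure tilt (10 deg/s roll, no translation): the net otolith
# signal then tracks the rotating gravity vector, so the computed
# translational acceleration should remain near zero.
dt, T = 0.001, 2000
omega = np.tile([np.deg2rad(10.0), 0.0, 0.0], (T, 1))    # constant roll
g_true = np.empty((T, 3)); g_true[0] = [0.0, 0.0, -9.81]
for k in range(T - 1):
    g_true[k + 1] = g_true[k] - np.cross(omega[k], g_true[k]) * dt
print(np.abs(estimate_translation(omega, -g_true, g_true[0], dt)).max())  # ~0
```
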
Fig. 3
Schematic illustration of the proposal of Roy and Cullen [29] for how active and passive head movements could be distinguished in vestibular neurons. During an active head movement, an efference copy of the neck motor command (red) is used to compute the expected sensory consequences of that command. This predicted signal is compared with the actual sensory feedback from neck proprioceptors (orange). When the actual sensory signal matches the prediction it is interpreted as being due to an active head movement and is used to generate a reafference “cancellation” signal (purple; output of “Actual/Predicted Comparison” box) which selectively suppresses vestibular signals that arise from self-generated movements. By matching active head velocity with a simultaneous passive head rotation in the opposite direction, Roy and Cullen [29] eliminated most or all of the sensory vestibular contribution (green) to central activities during active head movement, unmasking for the first time the presence of this “cancellation” signal.
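
The comparison can be sketched as a simple gating computation. The snippet below is a toy stand-in rather than Roy and Cullen's quantitative model: the forward model is an identity mapping, and the matching threshold, suppression gain and all-or-none gating are arbitrary assumptions:

```python
import numpy as np

def central_vestibular_response(vestibular_afference, efference_copy,
                                neck_feedback, tol=0.1, active_gain=0.3):
    """Toy gating sketch of the Fig. 3 comparison. A forward model (here an
    identity stand-in) predicts the proprioceptive consequences of the neck
    motor command; when actual neck feedback matches that prediction, the
    movement is labeled active and vestibular reafference is attenuated.
    `tol`, `active_gain` and the binary gate are illustrative assumptions."""
    predicted_feedback = efference_copy                 # forward-model stand-in
    is_active = np.abs(neck_feedback - predicted_feedback) < tol
    gain = np.where(is_active, active_gain, 1.0)        # suppress reafference
    return gain * vestibular_afference

# Passive rotation: no motor command, feedback mismatch, full response.
print(central_vestibular_response(np.array([1.0]), np.zeros(1), np.array([1.0])))
# Active rotation: command predicts feedback, response is attenuated.
print(central_vestibular_response(np.array([1.0]), np.array([1.0]), np.array([1.0])))
```
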
Fig. 4
Computations to estimate body motion by combining vestibular and neck proprioceptive signals. (a) Two required computational steps: 1) “Reference frame transformation” (left) transforms head-centered vestibular estimates of motion into a body-centered reference frame; vestibular signals must be combined nonlinearly (multiplicatively) with static proprioceptive estimates of head-on-body position. 2) “Body motion computation” (right) involves combining vestibular estimates of motion with dynamic proprioceptive signals to distinguish body motion from head motion with respect to the body. For descriptive purposes the two sets of computations are illustrated serially as distinct processing stages; however, both computations could occur in tandem within the same populations of neurons. (b) Reference frame experiment in which head-centered versus body-centered reference frames for encoding vestibular signals were dissociated by examining rostral fastigial neuron responses to pitch and roll rotation for different trunk-relative-to-head orientations (i.e., trunk left, center and right). Blue and green boxes indicate rotations about common body-centered axes. Data replotted with permission from Kleine et al. [46]. (c) Evidence for coding of body motion in the rostral fastigial nuclei [45**]. The example cell exhibited a robust response to body motion both during passive whole-body rotation that stimulated the semicircular canals (left) and during passive body-under-head rotation that stimulated neck proprioceptors, but it did not respond to head-on-body rotation (right), illustrating that vestibular and proprioceptive signals combined appropriately to distinguish body motion. Data replotted with permission from Brooks and Cullen [45**].
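
The two steps in (a) can be sketched for the horizontal-plane (pitch/roll) rotation components with the head yawed on the trunk, as in (b). The 2-D simplification, sign conventions and variable names below are assumptions for illustration:

```python
import numpy as np

def body_rotation_estimate(omega_head, head_on_body_yaw, omega_head_on_body):
    """Sketch of the two Fig. 4a computations. Step 1, reference frame
    transformation: rotate the head-centered vestibular velocity (pitch/roll
    components) into body coordinates using the static head-on-body yaw angle
    (the multiplicative interaction). Step 2, body motion computation:
    subtract the head-on-body velocity sensed by dynamic neck proprioceptors."""
    c, s = np.cos(head_on_body_yaw), np.sin(head_on_body_yaw)
    R = np.array([[c, -s], [s, c]])            # head-to-body yaw rotation
    omega_in_body = R @ omega_head             # step 1: frame transformation
    return omega_in_body - omega_head_on_body  # step 2: remove head-re-body motion

# Passive whole-body pitch with the head turned 30 degrees on the trunk: the
# neck signal is zero, so the estimate is simply the vestibular input
# re-expressed in body coordinates, as a body-centered code requires.
print(body_rotation_estimate(np.array([10.0, 0.0]), np.deg2rad(30.0), np.zeros(2)))
```
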

References

    1. Stein BE, Stanford TR. Multisensory integration: current issues from the perspective of the single neuron. Nat Rev Neurosci. 2008;9:255–266. - PubMed
    2. Driver J, Noesselt T. Multisensory interplay reveals crossmodal influences on ‘sensory-specific’ brain regions, neural responses, and judgments. Neuron. 2008;57:11–23. - PMC - PubMed
    3. Bulkin DA, Groh JM. Seeing sounds: visual and auditory interactions in the brain. Curr Opin Neurobiol. 2006;16:415–419. - PubMed
    4. Angelaki DE, Gu Y, DeAngelis GC. Multisensory integration: psychophysics, neurophysiology, and computation. Curr Opin Neurobiol. 2009. - PMC - PubMed
    5. Yuille AL, Bülthoff HH, editors. Bayesian decision theory and psychophysics. New York: Cambridge University Press; 1996.
