Review

A vestibular sensation: probabilistic approaches to spatial perception

Dora E Angelaki et al. Neuron. 2009 Nov 25;64(4):448-461. doi: 10.1016/j.neuron.2009.11.010.

Abstract

The vestibular system helps maintain equilibrium and clear vision through reflexes, but it also contributes to spatial perception. In recent years, research in the vestibular field has expanded to higher-level processing involving the cortex. Vestibular contributions to spatial cognition have been difficult to study because the circuits involved are inherently multisensory. Computational methods and the application of Bayes' theorem are used to form hypotheses about how information from different sensory modalities is combined with expectations based on past experience in order to obtain optimal estimates of cognitive variables such as current spatial orientation. To test these hypotheses, neuronal populations are being recorded during active tasks in which subjects make decisions based on vestibular and visual or somatosensory information. This review highlights what is currently known about the role of vestibular information in these processes, the computations necessary to obtain the appropriate signals, and the benefits that have emerged thus far.


Figures

Fig. 1. Visual/vestibular cue integration: theory and behavioral data
(A) According to the Bayesian framework, if the reliability of two cues, cue 1 (red) and cue 2 (blue), is equal, then when both cues are present, as in the bimodal condition (black), the posterior distribution should be narrower and the psychometric function steeper. (B) If the reliability of the two cues, cue 1 (red) and cue 2 (blue), is not equal, the resultant bimodal (posterior) probability is shifted towards the more reliable cue. Similarly, the bimodal psychometric function in a cue-conflict experiment will be shifted towards the more reliable cue. (C) This theory was tested using a heading discrimination task with vestibular cues (red), visual cues (blue) and a bimodal condition (black). (D) Average behavioral thresholds in two monkeys; the bimodal threshold (black) was lower than both single-cue thresholds (red = vestibular; blue = visual) and similar to the prediction from equation [3] (purple). Error bars represent standard errors. C and D are reprinted with permission from Gu Y, Angelaki DE, DeAngelis GC. Neural correlates of multisensory cue integration in macaque MSTd. Nat. Neurosci. 11: 1201–1210, 2008.
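The prediction labeled equation [3] (not reproduced on this page) is the standard maximum-likelihood cue-combination rule, in which each cue is weighted by its reliability (inverse variance). A minimal Python sketch of that rule, using made-up means and standard deviations rather than the authors' data, is:

```python
# Minimal sketch (not the authors' code) of reliability-weighted cue combination,
# the rule that the caption's equation [3] refers to. The combined estimate is an
# inverse-variance weighted average, and its predicted sigma (threshold) is lower
# than either single-cue sigma.
import numpy as np
from scipy.stats import norm

def combine_cues(mu_vest, sigma_vest, mu_vis, sigma_vis):
    """Inverse-variance (reliability-weighted) combination of two Gaussian cues."""
    w_vest = (1 / sigma_vest**2) / (1 / sigma_vest**2 + 1 / sigma_vis**2)
    mu_comb = w_vest * mu_vest + (1 - w_vest) * mu_vis        # shifted toward the more reliable cue
    sigma_comb = np.sqrt((sigma_vest**2 * sigma_vis**2) /
                         (sigma_vest**2 + sigma_vis**2))      # always <= min of the two sigmas
    return mu_comb, sigma_comb

# Hypothetical example: an unreliable vestibular cue and a reliable visual cue
mu_c, sigma_c = combine_cues(mu_vest=0.0, sigma_vest=2.0, mu_vis=1.0, sigma_vis=1.0)
print(mu_c, sigma_c)   # combined estimate lies nearer the visual cue; sigma_c < 1.0

# Predicted bimodal psychometric function (proportion of "rightward" choices vs. heading)
headings = np.linspace(-5.0, 5.0, 201)
p_rightward = norm.cdf(headings, loc=mu_c, scale=sigma_c)     # steeper than either single-cue curve
```

Here sigma_c corresponds to the predicted bimodal threshold plotted in purple in panel D.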
Fig. 2. Optimal multisensory integration in multimodal MSTd cells
(A) 'Congruent' cells have similar tuning for vestibular (red) and visual (blue) motion cues. (B) 'Opposite' cells respond best to vestibular cues in one direction and visual cues in the opposite direction. (C), (D) Mean firing rates of the example congruent and opposite MSTd cells during the heading discrimination task based on vestibular (red), visual (blue) or bimodal (black) cues. (E), (F) Neurometric functions for the same congruent and opposite MSTd cells (same data as in C and D). (G), (H) Average neuronal thresholds for congruent cells and opposite cells in the bimodal condition (black) are compared with the single-cue thresholds (red and blue) and with the prediction (equation [3], purple). Reprinted with permission from Gu Y, Angelaki DE, DeAngelis GC. Neural correlates of multisensory cue integration in macaque MSTd. Nat. Neurosci. 11: 1201–1210, 2008.
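For readers unfamiliar with neurometric functions: they are typically computed by applying ROC analysis to trial-by-trial firing-rate distributions and fitting the resulting choice probabilities with a cumulative Gaussian, whose sigma is taken as the neuronal threshold. The Python sketch below illustrates that generic procedure with simulated firing rates; it is an assumption-laden illustration, not the published analysis of Gu et al.

```python
# Hedged sketch of a generic ROC-based neurometric analysis; the tuning slope,
# noise level and trial counts below are invented for illustration.
import numpy as np
from scipy.stats import norm
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
headings = np.array([-8.0, -4.0, -2.0, -1.0, 1.0, 2.0, 4.0, 8.0])   # deg, hypothetical

def simulate_rates(h, baseline=20.0, gain=1.5, noise_sd=4.0, n_trials=50):
    """Trial-by-trial firing rates of a hypothetical heading-tuned cell."""
    return baseline + gain * h + noise_sd * rng.standard_normal(n_trials)

def roc_area(rates_a, rates_b):
    """Probability that a random trial from rates_a exceeds a random trial from rates_b."""
    diffs = rates_a[:, None] - rates_b[None, :]
    return np.mean(diffs > 0) + 0.5 * np.mean(diffs == 0)

reference = simulate_rates(0.0)                       # responses to straight-ahead heading
p_pref = np.array([roc_area(simulate_rates(h), reference) for h in headings])

# Fit a cumulative Gaussian; its sigma is read out as the neuronal threshold
cum_gauss = lambda x, mu, sigma: norm.cdf(x, mu, sigma)
(mu_hat, sigma_hat), _ = curve_fit(cum_gauss, headings, p_pref, p0=[0.0, 3.0])
print(f"simulated neuronal threshold ~ {sigma_hat:.1f} deg")
```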
Fig. 3. Schematic illustrating the influence of a zero inertial acceleration prior at low and high frequency linear accelerations
(A) During high-frequency linear accelerations, canal cues can resolve the ambiguous otolith signal and distinguish when a translation has occurred. Thus, the vestibular translation likelihood (blue) is much narrower (more reliable) than the zero inertial acceleration prior (red), resulting in a posterior density function (black) that is little affected by the prior. (B) During low-frequency linear accelerations, canal cues cannot disambiguate whether the otolith activation signals a tilt or a translation, resulting in a vestibular translation likelihood that is much broader (larger variance) than the prior. As a result, the posterior (black trace) is pulled towards zero by the prior and the otolith activation is interpreted as a tilt relative to gravity.
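The effect illustrated in Fig. 3 follows directly from multiplying Gaussian densities. A minimal sketch with made-up numbers (Python) shows how a zero-acceleration prior leaves a narrow, high-frequency translation likelihood essentially untouched but pulls a broad, low-frequency likelihood toward zero:

```python
# Minimal sketch: product of a Gaussian translation likelihood and a Gaussian
# prior centered on zero inertial acceleration. All numbers are illustrative.
import numpy as np

def posterior(mu_like, sigma_like, sigma_prior, mu_prior=0.0):
    """Mean and s.d. of the Gaussian posterior obtained from likelihood x prior."""
    w = (1 / sigma_like**2) / (1 / sigma_like**2 + 1 / sigma_prior**2)
    mu_post = w * mu_like + (1 - w) * mu_prior
    sigma_post = np.sqrt(1.0 / (1 / sigma_like**2 + 1 / sigma_prior**2))
    return mu_post, sigma_post

sigma_prior = 0.5   # width of the zero-acceleration prior (arbitrary units)

# (A) High-frequency translation: canal cues make the likelihood narrow
print(posterior(mu_like=1.0, sigma_like=0.1, sigma_prior=sigma_prior))
# -> posterior mean stays near 1.0: the motion is perceived as translation

# (B) Low-frequency translation: the likelihood is broad (ambiguous otolith signal)
print(posterior(mu_like=1.0, sigma_like=2.0, sigma_prior=sigma_prior))
# -> posterior mean collapses toward 0: the acceleration is attributed to tilt
```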
Fig. 4. Misperceptions of translation due to changes in tilt
(A) Off-vertical axis rotation (OVAR) rotates the subject about an axis that is not aligned with gravity. Thus the subject's head is constantly changing its orientation relative to gravity. (B) Subjects undergoing OVAR perceive that they are swaying around a cone. The head's orientation changes from nose-down (ND) to right-ear-down (RED) to nose-up (NU) to left-ear-down (LED), while facing the same direction in space. (C), (D) Over time, the subject's perception of rotation dissipates and, in an apparently compensatory fashion, the subject begins to perceive that they are translating. Replotted with permission from Vingerhoets RA, Van Gisbergen JA, Medendorp WP. Verticality perception during off-vertical axis rotation. J. Neurophysiol. 97: 3256–3268, 2007.
Fig. 5. Accuracy and precision in the perception of vertical line orientation and vertical visual motion direction during static body tilts
A 2AFC task was implemented to gauge subjects' ability to perceive either (1) the direction in which a random dot pattern was moving relative to gravity (blue squares) or (2) how a visible line was oriented relative to gravity (red circles). Subjects' performance was quite accurate when upright (B – the center of the psychometric function is near zero on the abscissa), but became inaccurate with large body tilts (A and C). Errors were always in the same direction as the body tilt. Note that systematic errors in accuracy (illustrated by the bias of the psychometric functions) were similar for the motion perception and visual vertical tasks. However, precision (illustrated by the slope of the psychometric function) was higher for the line orientation task than for the motion direction task (steeper psychometric functions reflect higher precision). Vertical dashed lines indicate performance at 50% CW reports. Replotted with permission from De Vrijer M, Medendorp WP, Van Gisbergen JA. Shared computational mechanism for tilt compensation accounts for biased verticality percepts in motion and pattern vision. J. Neurophysiol. 99: 915–930, 2008.
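As a concrete illustration of how bias (accuracy) and slope (precision) are read off such psychometric functions, the following Python sketch fits a cumulative Gaussian to simulated 2AFC "clockwise" reports; the orientations, bias and trial counts are invented, not the data of De Vrijer et al.:

```python
# Hedged sketch of a standard 2AFC psychometric fit: accuracy = the 50%-CW point
# (mu), precision = steepness (1/sigma). Responses below are simulated.
import numpy as np
from scipy.stats import norm
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
orientations = np.linspace(-20.0, 20.0, 9)     # line orientation re gravity (deg), hypothetical
true_mu, true_sigma, n_trials = 5.0, 4.0, 40   # a tilted observer with a 5-deg bias

p_cw_true = norm.cdf(orientations, true_mu, true_sigma)
n_cw = rng.binomial(n_trials, p_cw_true)       # simulated "clockwise" report counts

cum_gauss = lambda x, mu, sigma: norm.cdf(x, mu, sigma)
(mu_hat, sigma_hat), _ = curve_fit(cum_gauss, orientations, n_cw / n_trials, p0=[0.0, 5.0])
print(f"bias (accuracy) ~ {mu_hat:.1f} deg; sigma (inverse precision) ~ {sigma_hat:.1f} deg")
```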
Fig. 6. A Bayesian model explains the observed errors in body-tilt perception
In order to determine body orientation in space (φs), the brain takes into account the sensory likelihoods of the orientation of lines on the retina (φr) and of head tilt (ρ), the latter of which is noisy (as indicated by the thick white cloud around the mean). The brain also takes into account prior knowledge that head tilts around 0° are much more likely than head tilts far from upright (prior box). The tilt likelihood and prior probabilities multiply to yield the posterior distribution (posterior box). The resultant orientation in space is determined by summing the retinal orientation (φr) with the posterior tilt estimate (β). The three functions in the dashed box can be related back to the three functions in Figs. 1A,B and 3. Replotted with permission from De Vrijer M, Medendorp WP, Van Gisbergen JA. Shared computational mechanism for tilt compensation accounts for biased verticality percepts in motion and pattern vision. J. Neurophysiol. 99: 915–930, 2008.
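The computation described in this caption can be written out in a few lines. The Python sketch below is only an illustration of that description (the prior and likelihood widths are hypothetical, not the fitted parameters of De Vrijer et al.): a precision-weighted posterior tilt estimate is added to the retinal line orientation to give the perceived orientation in space.

```python
# Sketch of the Fig. 6 computation with illustrative parameter values: a noisy
# sensed head tilt is combined with a prior favoring upright (0 deg), and the
# posterior tilt estimate (beta) is added to the retinal orientation (phi_r).
def perceived_orientation_in_space(phi_r, rho_sensed, sigma_tilt, sigma_prior):
    """phi_r: retinal line orientation (deg); rho_sensed: sensed head tilt (deg)."""
    # Posterior tilt estimate: precision-weighted mix of sensed tilt and the 0-deg prior mean
    w = (1 / sigma_tilt**2) / (1 / sigma_tilt**2 + 1 / sigma_prior**2)
    beta = w * rho_sensed
    return phi_r + beta                      # phi_s, perceived orientation in space

# Upright observer viewing a vertical line: no tilt to misestimate, so no error
print(perceived_orientation_in_space(phi_r=0.0, rho_sensed=0.0, sigma_tilt=6.0, sigma_prior=10.0))

# Observer tilted 90 deg viewing a gravity-vertical line (retinal orientation -90 deg):
# the prior shrinks the tilt estimate, so tilt is under-compensated and the vertical
# line is misperceived, consistent with the systematic errors seen at large body tilts.
print(perceived_orientation_in_space(phi_r=-90.0, rho_sensed=90.0, sigma_tilt=6.0, sigma_prior=10.0))
```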
