J Neurosci. 2009 Dec 9;29(49):15601-12. doi: 10.1523/JNEUROSCI.2574-09.2009.

Dynamic reweighting of visual and vestibular cues during self-motion perception


Christopher R Fetsch et al. J Neurosci.

Abstract

The perception of self-motion direction, or heading, relies on integration of multiple sensory cues, especially from the visual and vestibular systems. However, the reliability of sensory information can vary rapidly and unpredictably, and it remains unclear how the brain integrates multiple sensory signals given this dynamic uncertainty. Human psychophysical studies have shown that observers combine cues by weighting them in proportion to their reliability, consistent with statistically optimal integration schemes derived from Bayesian probability theory. Remarkably, because cue reliability is varied randomly across trials, the perceptual weight assigned to each cue must change from trial to trial. Dynamic cue reweighting has not been examined for combinations of visual and vestibular cues, nor has the Bayesian cue integration approach been applied to laboratory animals, an important step toward understanding the neural basis of cue integration. To address these issues, we tested human and monkey subjects in a heading discrimination task involving visual (optic flow) and vestibular (translational motion) cues. The cues were placed in conflict on a subset of trials, and their relative reliability was varied to assess the weights that subjects gave to each cue in their heading judgments. We found that monkeys can rapidly reweight visual and vestibular cues according to their reliability, the first such demonstration in a nonhuman species. However, some monkeys and humans tended to over-weight vestibular cues, inconsistent with simple predictions of a Bayesian model. Nonetheless, our findings establish a robust model system for studying the neural mechanisms of dynamic cue reweighting in multisensory perception.
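The statistically optimal integration scheme the abstract refers to is the standard inverse-variance (reliability) weighting rule from Bayesian cue combination. As a minimal sketch (the paper's Equations 1 and 2 are not reproduced in this excerpt, so the function names and the assumption of independent Gaussian cues are illustrative):

```python
import numpy as np

def optimal_weights(sigma_vis, sigma_vest):
    """Reliability (inverse-variance) weights for two independent cues.

    A cue's reliability is 1/sigma^2; the optimal weight of each cue is
    its reliability divided by the summed reliability of both cues.
    """
    r_vis, r_vest = 1.0 / sigma_vis**2, 1.0 / sigma_vest**2
    w_vis = r_vis / (r_vis + r_vest)
    return w_vis, 1.0 - w_vis

def combined_sigma(sigma_vis, sigma_vest):
    """Predicted discrimination threshold when both cues are combined.

    Optimal combination can only lower the threshold relative to either
    single cue: sigma_comb^2 = (sigma_vis^2 * sigma_vest^2) /
    (sigma_vis^2 + sigma_vest^2).
    """
    return np.sqrt(sigma_vis**2 * sigma_vest**2 /
                   (sigma_vis**2 + sigma_vest**2))
```

Under this rule, degrading the visual cue (e.g., lowering motion coherence, which raises sigma_vis) shifts weight toward the vestibular cue, which is the trial-by-trial reweighting the study tests.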


Figures

Figure 1.
Stimuli and task. A, Top view of the three stimulus conditions used in the heading discrimination task: visual (optic flow only, indicated by the red expanding optic flow pattern), vestibular (platform motion only, indicated by the black arrows), and combined (optic flow and platform motion). In all conditions, the monkey was required to fixate a central target during the stimulus and then saccade to a rightward or leftward target at the end of each trial to indicate its perceived heading relative to straight forward (one-interval version) or relative to the first interval (two-interval version). The heading depicted in this schematic is straight forward (0°), and thus there would be no correct answer (monkey was rewarded randomly). B, Stimulus arrangement during cue-conflict trials in the one-interval version of the task (angles not to scale). Positive Δ (left) indicates visual to the right, vestibular to the left, and vice versa for negative Δ (middle). For a given Δ (right), heading angle was defined as the midpoint between the visual and vestibular heading trajectories, which were varied together in fine steps around straight forward (positive heading angle indicates rightward motion). C, Two-interval variant of the task, used in human subjects, in which the subject must judge the heading angle of the second stimulus relative to the first. The standard interval was always straight forward, except in conflict trials when the visual and vestibular heading would be displaced to the right and left of straight forward by Δ/2. The comparison heading varied in small steps around the standard and was always cue consistent. The order of presentation could be either standard first and comparison second (left) or vice versa (right).
Figure 2.
Simulated data demonstrating the method for measuring actual cue weights. For simplicity, this example considers only the case of +Δ (visual to the right, vestibular to the left) in the one-interval version of the task. The leftmost psychometric curve would be observed if the subject were completely ignoring the vestibular cue (visual capture, or a vestibular weight of 0). This can be understood by thinking of the PSE (red dot) as the heading angle at which the subject “feels” that his motion was straight forward (and thus would make 50% rightward and 50% leftward decisions, assuming no choice bias). By definition, the heading angle at which this occurs is −Δ/2, and thus a (bias-adjusted) PSE of −Δ/2 maps onto a vestibular weight of 0 when Δ is positive, consistent with Equation 3. The opposite is true for the rightmost curve: the PSE (black dot) is +Δ/2, meaning that the ambiguous stimulus (perceived as straight forward) is the one in which the vestibular cue is pointing straight forward, corresponding to a vestibular weight of 1. If the PSE is 0 (cyan), it means the cues are weighted equally (weights equivalent to 0.5) and the subject estimated the heading angle to be the average of the two cues. Any PSE shift between −Δ/2 and +Δ/2 results in a vestibular weight that is scaled linearly between 0 and 1 (see axis below the abscissa). The analysis is identical for the two-interval task, except that the abscissa represents the comparison heading, and the direction of the expected PSE shift is reversed for a given weight (because the conflict is in the standard interval).
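The linear PSE-to-weight mapping the legend describes can be written as a one-line helper. This is a reconstruction from the legend's stated endpoints (the paper's Equation 3 is not reproduced in this excerpt), assuming the one-interval sign convention in which positive Δ means visual to the right of vestibular:

```python
def vestibular_weight(pse, delta):
    """Empirical vestibular weight from a bias-adjusted PSE shift.

    For conflict angle delta, the legend's mapping is linear:
      PSE = -delta/2 -> weight 0   (visual capture)
      PSE =  0       -> weight 0.5 (equal weighting)
      PSE = +delta/2 -> weight 1   (vestibular capture)
    """
    return pse / delta + 0.5
```

Because the mapping divides by delta, the same expression handles negative conflict angles, for which the direction of the expected PSE shift reverses.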
Figure 3.
Example psychometric functions. The proportion of rightward decisions is plotted against heading angle, including all data from a single animal (monkey Y). Solid curves depict the best-fitting cumulative Gaussian function. For the single-cue conditions (A), separate curves are plotted for the vestibular condition (black circles) and visual condition at each of the four coherence levels tested (12, 24, 48, and 96%, coded as different shapes and progressively darker shades of red). For the combined condition data, separate plots (B–E) are made for each coherence level, and within each plot the data are separated by conflict angle [blue, Δ = −4° (visual to the left of vestibular); cyan, Δ = 0° (cue consistent); green, Δ = +4° (visual to the right of vestibular)]. Dashed curves represent the predicted psychometric functions for each Δ, based on the predicted cue weights computed from single-cue thresholds (Eq. 2). These dashed curves were given the predicted thresholds derived from Equation 1 and were offset to match the overall bias (represented by the cyan solid curve), consistent with the manner in which we computed actual cue weights (see Eq. 3 and associated text). The reversal of the green and blue curves from low (B, C) to high (D, E) coherence indicates a shift from vestibular dominance to visual dominance (for additional explanation, see Fig. 2 and Materials and Methods).
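Fitting a cumulative Gaussian to choice data, as in these plots, can be sketched with SciPy; the fitted mean is the PSE and the fitted standard deviation the discrimination threshold. This is a minimal illustration (the paper's actual fitting procedure, e.g., lapse-rate handling per Wichmann and Hill, is not reproduced here):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def fit_psychometric(headings, prop_rightward):
    """Fit a cumulative Gaussian to proportion-rightward data.

    Returns (mu, sigma): mu is the PSE (heading judged straight forward)
    and sigma is the discrimination threshold.
    """
    cdf = lambda x, mu, sigma: norm.cdf(x, loc=mu, scale=sigma)
    (mu, sigma), _ = curve_fit(cdf, headings, prop_rightward,
                               p0=[0.0, 2.0])
    return mu, sigma
```

A PSE shift between conflict conditions (the green vs. blue curves) is what Equation 3's weight estimate is read from, while a shallower slope (larger sigma) indicates a higher threshold.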
Figure 4.
Summary of predicted and actual cue weights. Vestibular weight (1 − visual weight) is plotted as a function of visual motion coherence for each of five monkey subjects (A–E) and averaged across subjects (F). Predicted weights (open symbols) were computed from Equation 2 using single-cue thresholds, and actual weights (filled symbols) were computed from the shift of the PSE relative to the magnitude of cue conflict (see Eq. 3 and Fig. 2). Error bars in A–E represent 95% CIs computed using the following bootstrap procedure. Choice data were resampled across repetitions (with replacement) and refit 250 times to create distributions of the PSE and threshold for each psychometric function. We then drew 1000 random samples from these distributions to compute 1000 bootstraps of predicted and actual weight (Eqs. 2, 3) and computed the CIs directly from these bootstraps (percentile method). Similar CIs were obtained using error propagation (data not shown). Error bars in F represent ±SEM across subjects.
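The percentile-bootstrap procedure in this legend can be sketched generically. In this minimal helper the resample-and-refit step is abstracted into a `statistic` callback (which in the paper's pipeline would refit the psychometric function and recompute the weight); the names and defaults are illustrative:

```python
import numpy as np

def bootstrap_ci(trials, statistic, n_boot=250, ci=95, seed=0):
    """Percentile bootstrap CI for a statistic of per-trial data.

    Resamples trials with replacement, recomputes the statistic on each
    resample, and returns the (lo, hi) percentiles of the resulting
    bootstrap distribution.
    """
    rng = np.random.default_rng(seed)
    n = len(trials)
    boots = np.array([statistic(trials[rng.integers(0, n, size=n)])
                      for _ in range(n_boot)])
    alpha = (100 - ci) / 2
    return np.percentile(boots, alpha), np.percentile(boots, 100 - alpha)
```

The percentile method reads the CI directly from the bootstrap distribution, with no normality assumption, which is why the legend can cross-check it against error propagation.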
Figure 5.
Summary of psychophysical thresholds. For each monkey, single-cue visual (red squares) and vestibular (black squares) discrimination thresholds are plotted against coherence, along with the predicted (blue open circles) and actual (blue filled circles) combined thresholds. Error bars for actual thresholds (single-cue and combined) represent 95% CIs from the psychometric fits themselves (Wichmann and Hill, 2001b), whereas for predicted combined thresholds, they represent bootstrapped CIs via the method described in the legend of Figure 4, except using Equation 1 instead of Equations 2 and 3.
Figure 6.
Correlation between optimality in weights and thresholds. For each session, deviation from the optimal (predicted) vestibular weight was plotted as a function of the ratio of actual to predicted combined thresholds, color coded by monkey identity. The left and right halves of the plot contain sessions in which the animal performed better or worse than the prediction, respectively. The upper quadrants indicate vestibular over-weighting (visual under-weighting) and vice versa for the lower quadrants. The significant correlation (r = 0.375, p < 0.0001) implies that the over-weighting of the vestibular cue goes hand in hand with the inability to show improved discrimination performance in the combined condition.
Figure 7.
Lack of an effect of binocular depth cues on weights and thresholds. Two monkeys were tested in additional sessions with stereo cues removed from the optic flow stimulus. Performance without disparity cues remained close to optimal predictions for both the weights (A) and thresholds (B–D).
Figure 8.
Weights and thresholds for human subjects. The weights (A) and thresholds (B) are plotted in the same format as Figures 4 and 5, respectively, averaged across five human subjects (error bars represent ±SEM). Individual subject data are shown in supplemental Figure 6 (available at www.jneurosci.org as supplemental material).

References

    1. Alais D, Burr D. The ventriloquist effect results from near-optimal bimodal integration. Curr Biol. 2004;14:257–262.
    2. Battaglia PW, Jacobs RA, Aslin RN. Bayesian integration of visual and auditory signals for spatial localization. J Opt Soc Am A Opt Image Sci Vis. 2003;20:1391–1397.
    3. Berthoz A, Pavard B, Young LR. Perception of linear horizontal self-motion induced by peripheral vision (linearvection): basic characteristics and visual-vestibular interactions. Exp Brain Res. 1975;23:471–489.
    4. Bertin RJ, Berthoz A. Visuo-vestibular interaction in the reconstruction of travelled trajectories. Exp Brain Res. 2004;154:11–21.
    5. Brandt T, Dichgans J, Koenig E. Perception of self-rotation (circular vection) induced by optokinetic stimuli. Pflugers Arch. 1972;332(Suppl 332):R398.
