Review
2023 Sep 13;14:1266513. doi: 10.3389/fneur.2023.1266513. eCollection 2023.

Temporal and spatial properties of vestibular signals for perception of self-motion

Bingyu Liu et al. Front Neurol.

Abstract

It is well recognized that the vestibular system is involved in numerous important cognitive functions, including self-motion perception, spatial orientation, locomotion, and vector-based navigation, in addition to basic reflexes, such as oculomotor or body postural control. Consistent with this rationale, vestibular signals exist broadly in the brain, including several regions of the cerebral cortex, potentially allowing tight coordination with other sensory systems to improve the accuracy and precision of perception or action during self-motion. Recent neurophysiological studies in animal models based on single-cell resolution indicate that vestibular signals exhibit complex spatiotemporal dynamics, producing challenges in identifying their exact functions and how they are integrated with other modality signals. For example, vestibular and optic flow could provide congruent and incongruent signals regarding spatial tuning functions, reference frames, and temporal dynamics. Comprehensive studies, including behavioral tasks, neural recording across sensory and sensory-motor association areas, and causal link manipulations, have provided some insights into the neural mechanisms underlying multisensory self-motion perception.

Keywords: heading; optic flow; rotation–linear motion; self-motion; vestibular.


Conflict of interest statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Figures

Figure 1
Schematic summary of vestibular signals broadly distributed across multiple regions of the brain. Cortical areas largely dominated by vestibular signals are labeled in blue, while multisensory areas (e.g., vestibular and visual) are shown in green. The vestibular tuning properties are depicted above the lines, with white denoting acceleration and black denoting velocity. The reference frames of the different areas are illustrated below the lines, with the varied colors signifying distinct reference-frame preferences.
Figure 2
Experimental setup and near-optimal multisensory integration performance. (A) The virtual reality setup for the heading discrimination task. Vestibular cues were provided by a 6-degree-of-freedom motion platform; a visual display mounted on the platform provided optic-flow stimuli simulating real motion. (B) Behavioral performance of one monkey in the vestibular-only, visual-only, and combined visual-vestibular heading discrimination tasks. Note that this monkey's threshold in the combined condition is near-optimal according to the prediction of Bayesian integration theory (inset).
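The "near-optimal" prediction in panel (B) follows from Bayesian (maximum-likelihood) cue integration: if the single-cue heading estimates carry independent Gaussian noise, the optimal combined threshold is the harmonic combination of the two single-cue thresholds, so it is always at or below the better single cue. A minimal sketch, with hypothetical threshold values chosen only for illustration:

```python
import math

def predicted_combined_threshold(sigma_vest, sigma_vis):
    """Optimal (Bayesian) prediction for the combined-cue threshold:
    the combined variance is sigma_v^2 * sigma_o^2 / (sigma_v^2 + sigma_o^2),
    i.e., the combined threshold never exceeds the better single cue."""
    return math.sqrt((sigma_vest**2 * sigma_vis**2)
                     / (sigma_vest**2 + sigma_vis**2))

# Hypothetical single-cue heading thresholds (degrees), for illustration only.
sigma_vest, sigma_vis = 2.0, 1.5
sigma_comb = predicted_combined_threshold(sigma_vest, sigma_vis)
print(round(sigma_comb, 2))  # 1.2
```

Comparing a measured combined threshold against this prediction is the standard test of whether behavior is statistically optimal.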
Figure 3
Hypothesized multisensory integration mechanisms. In the early integration model, vestibular (from the PIVC) and optic-flow signals are first integrated in a sensory area, the MSTd, and then transmitted to high-level decision-making areas. In the late integration model, the two heading signals do not converge until they reach decision-related areas, such as the LIP or FEFsac.
Figure 4
Manipulating temporal offset in the visuovestibular inputs and hypothetical behavioral outputs. (A) Fisher information in the vestibular, visual, and combined conditions when the temporal offset between the two heading cues is zero (top row) or nonzero (bottom row). Black arrows indicate the hypothetical readout time. (B) Predicted psychophysical threshold for each model output. For the final-readout model, the results are identical for the temporally congruent and incongruent cases; thus, only one case (temporally incongruent) is shown. (C) The performance of two monkeys shows improved heading discrimination in the cue-combined condition when the visual input is artificially advanced to lead the vestibular cue by 250-500 ms. At the bottom of the plots, the corresponding velocity and acceleration profiles of the visual and vestibular cues are shown for each temporal-offset condition. Redrawn with permission from Zheng et al. (33).
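The logic of panel (C) can be sketched numerically: vestibular Fisher information tends to track stimulus acceleration (peaking early), while visual information tracks velocity (peaking near mid-stimulus), so the instantaneous combined information (the sum of the two) peaks higher when the visual cue is advanced to align the two peaks. The Gaussian information profiles and peak times below are illustrative assumptions, not data from the study:

```python
import math

def gaussian(t, mu, sd):
    """Unnormalized Gaussian bump, used as an illustrative information profile."""
    return math.exp(-((t - mu) ** 2) / (2 * sd ** 2))

def peak_combined_info(visual_lead, t_step=0.01):
    """Sum the instantaneous Fisher information from the two cues over a
    2-s stimulus and return the peak of the combined profile.
    Assumed (hypothetical) profiles: vestibular info peaks at 0.7 s
    (acceleration-like), visual info at 1.0 s (velocity-like)."""
    best = 0.0
    for i in range(201):
        t = i * t_step
        i_vest = gaussian(t, 0.7, 0.25)                 # acceleration-like peak
        i_vis = gaussian(t + visual_lead, 1.0, 0.25)    # velocity-like peak, shifted
        best = max(best, i_vest + i_vis)
    return best

# Advancing the visual cue by ~300 ms aligns the two information peaks,
# raising the peak combined information available at readout.
print(peak_combined_info(0.0) < peak_combined_info(0.3))  # True
```

This is the intuition behind why an artificial visual lead of a few hundred milliseconds can improve combined-cue thresholds: it compensates for the different temporal dynamics of the two channels at the moment of readout.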


References

    1. Brandt T, Schautzer F, Hamilton DA, Brüning R, Markowitsch HJ, Kalla R, et al. Vestibular loss causes hippocampal atrophy and impaired spatial memory in humans. Brain. (2005) 128:2732–41. doi: 10.1093/brain/awh617
    2. Etienne AS, Jeffery KJ. Path integration in mammals. Hippocampus. (2004) 14:180–92. doi: 10.1002/hipo.10173
    3. Gallistel CR. The organization of learning. Cambridge, MA: The MIT Press; (1990). 648 p.
    4. Valerio S, Taube JS. Path integration: how the head direction signal maintains and corrects spatial orientation. Nat Neurosci. (2012) 15:1445–53. doi: 10.1038/nn.3215
    5. Cullen KE, Taube JS. Our sense of direction: progress, controversies and challenges. Nat Neurosci. (2017) 20:1465–73. doi: 10.1038/nn.4658
