Neural sensitivity to translational self- and object-motion velocities
- PMID: 38224544
- PMCID: PMC10785198
- DOI: 10.1002/hbm.26571
Abstract
The ability to detect and assess world-relative object-motion is a critical computation performed by the visual system. This computation, however, is greatly complicated by the observer's movements, which generate a global pattern of motion on the observer's retina. How the visual system implements this computation is poorly understood. Since we are potentially able to detect a moving object if its motion differs in velocity (or direction) from the expected optic flow generated by our own motion, here we manipulated the relative motion velocity between the observer and the object within a stationary scene as a strategy to test how the brain accomplishes object-motion detection. Specifically, we tested the neural sensitivity of brain regions that are known to respond to egomotion-compatible visual motion (i.e., egomotion areas: cingulate sulcus visual area, posterior cingulate sulcus area, posterior insular cortex [PIC], V6+, V3A, IPSmot/VIP, and MT+) to a combination of different velocities of visually induced translational self- and object-motion within a virtual scene while participants were instructed to detect object-motion. To this aim, we combined individual surface-based brain mapping, task-evoked activity by functional magnetic resonance imaging, and parametric and representational similarity analyses. We found that all the egomotion regions (except area PIC) responded to all the possible combinations of self- and object-motion and were modulated by the self-motion velocity. Interestingly, we found that, among all the egomotion areas, only MT+, V6+, and V3A were further modulated by object-motion velocities, hence reflecting their possible role in discriminating between distinct velocities of self- and object-motion. We suggest that these egomotion regions may be involved in the complex computation required for detecting scene-relative object-motion during self-motion.
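As a worked illustration of the detection principle stated above (an object can be singled out as moving in the scene when its retinal velocity deviates from the optic flow expected from one's own translation), the following minimal Python sketch compares an observed retinal speed against a self-motion prediction. All numbers, function names, and the pinhole-style flow approximation are hypothetical illustrations, not the stimulus or analysis code used in the study:

```python
import numpy as np

def expected_flow(self_speed, depth):
    # Angular speed (deg/s) predicted for a STATIC point at distance `depth` (m)
    # during lateral translation at `self_speed` (m/s); a crude small-angle
    # approximation, angular speed ~= V / Z (hypothetical simplification).
    return np.degrees(self_speed / depth)

def looks_like_object_motion(retinal_speed, self_speed, depth, threshold=1.0):
    # Flag scene-relative object motion when the observed retinal speed (deg/s)
    # deviates from the self-motion prediction by more than `threshold`.
    return abs(retinal_speed - expected_flow(self_speed, depth)) > threshold

# Hypothetical example: observer translating at 1 m/s, point at 4 m depth.
print(expected_flow(1.0, 4.0))                   # ~14.3 deg/s expected from self-motion alone
print(looks_like_object_motion(20.0, 1.0, 4.0))  # True: extra retinal motion -> likely a moving object
print(looks_like_object_motion(14.0, 1.0, 4.0))  # False: consistent with a static scene
```

In the experiment this comparison was probed by independently varying the velocities of visually induced self- and object-motion within the virtual scene while participants detected object-motion.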
Keywords: brain mapping; flow parsing; functional magnetic resonance imaging; motion detection; optic flow; virtual reality.
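One of the analyses mentioned in the abstract is representational similarity analysis (RSA). The sketch below is a generic, hypothetical example of that technique: placeholder response patterns and invented velocity labels are compared by rank-correlating a neural dissimilarity matrix with a model dissimilarity matrix. It illustrates the general approach only, not the authors' actual pipeline:

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_conditions, n_voxels = 6, 200                    # e.g., 6 self-/object-motion velocity combinations (hypothetical)
patterns = rng.normal(size=(n_conditions, n_voxels))  # placeholder ROI response patterns, one row per condition

neural_rdm = pdist(patterns, metric="correlation")    # neural RDM: 1 - Pearson r between condition patterns
velocities = np.array([0, 1, 2, 0, 1, 2])             # invented object-motion velocity labels per condition
model_rdm = pdist(velocities[:, None], metric="euclidean")  # model RDM: velocity differences between conditions

rho, p = spearmanr(neural_rdm, model_rdm)             # rank correlation between the two RDMs
print(f"RDM correlation: rho={rho:.2f}, p={p:.3f}")
```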
© 2024 The Authors. Human Brain Mapping published by Wiley Periodicals LLC.
Conflict of interest statement
The authors declare no conflicts of interest.
Similar articles
- Egomotion-related visual areas respond to active leg movements. Hum Brain Mapp. 2019 Aug 1;40(11):3174-3191. doi: 10.1002/hbm.24589. Epub 2019 Mar 28. PMID: 30924264. Free PMC article.
- Selectivity to translational egomotion in human brain motion areas. PLoS One. 2013;8(4):e60241. doi: 10.1371/journal.pone.0060241. Epub 2013 Apr 5. PMID: 23577096. Free PMC article.
- A common neural substrate for processing scenes and egomotion-compatible visual motion. Brain Struct Funct. 2020 Sep;225(7):2091-2110. doi: 10.1007/s00429-020-02112-8. Epub 2020 Jul 9. PMID: 32647918. Free PMC article.
- Neural substrates underlying the passive observation and active control of translational egomotion. J Neurosci. 2015 Mar 11;35(10):4258-67. doi: 10.1523/JNEUROSCI.2647-14.2015. PMID: 25762672. Free PMC article.
- Human cortical areas underlying the perception of optic flow: brain imaging studies. Int Rev Neurobiol. 2000;44:269-92. doi: 10.1016/s0074-7742(08)60746-1. PMID: 10605650. Review.
Cited by
- Visuo-vestibular integration for self-motion: human cortical area V6 prefers forward and congruent stimuli. Exp Brain Res. 2025 May 23;243(6):152. doi: 10.1007/s00221-025-07106-8. PMID: 40407870. Free PMC article.
- Common and specific activations supporting optic flow processing and navigation as revealed by a meta-analysis of neuroimaging studies. Brain Struct Funct. 2024 Jun;229(5):1021-1045. doi: 10.1007/s00429-024-02790-8. Epub 2024 Apr 9. PMID: 38592557. Free PMC article. Review.