Computations for geometrically accurate visually guided reaching in 3-D space
- PMID: 18217844
- DOI: 10.1167/7.5.4
Abstract
A fundamental question in neuroscience is how the brain transforms visual signals into accurate three-dimensional (3-D) reach commands, but surprisingly this has never been formally modeled. Here, we developed such a model and tested its predictions experimentally in humans. Our visuomotor transformation model used visual information about current hand and desired target positions to compute the visual (gaze-centered) desired movement vector. It then transformed these eye-centered plans into shoulder-centered motor plans using extraretinal eye and head position signals, accounting for the complete 3-D eye-in-head and head-on-shoulder geometry (i.e., translation and rotation). We compared actual memory-guided reaching performance to the predictions of the model. By removing extraretinal signals (i.e., eye-head rotations and the offset between the centers of rotation of the eye and head) from the model, we developed a compensation index describing how accurately the brain performs the 3-D visuomotor transformation for different head-restrained and head-unrestrained gaze positions, as well as for eye and head roll. Overall, subjects did not show the errors predicted when extraretinal signals were ignored. Their reaching performance was accurate, and the compensation index revealed that subjects accounted for the 3-D visuomotor transformation geometry. This was also the case for the initial portion of the movement (before proprioceptive feedback), indicating that the desired reach plan is computed in a feed-forward fashion. These findings show that the visuomotor transformation for reaching implements an internal model of the complete eye-to-shoulder linkage geometry and does not rely solely on feedback control mechanisms. We discuss the relevance of this model in predicting reaching behavior in several patient groups.
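The transformation the abstract describes — a gaze-centered movement vector carried through the eye-in-head and head-on-shoulder rotations and the translational offsets between centers of rotation — can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: the frame conventions, angles, offsets, and hand/target positions are all assumed values chosen for the example.

```python
import math

def rot_z(deg):
    """3x3 rotation about the vertical axis (a horizontal gaze shift)."""
    r = math.radians(deg)
    c, s = math.cos(r), math.sin(r)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def apply(M, v):
    """Matrix-vector product for 3x3 M and length-3 v."""
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

def add(a, b):
    return [x + y for x, y in zip(a, b)]

# --- hypothetical linkage geometry (illustrative, not from the paper) ---
R_eye_in_head  = rot_z(20.0)          # eye rotated 20 deg in the head
R_head_on_body = rot_z(10.0)          # head rotated 10 deg on the trunk
t_eye_in_head  = [0.00, 0.10, 0.05]   # eye center relative to head center (m)
t_head_on_body = [0.00, 0.25, 0.30]   # head center relative to shoulder (m)

def eye_to_shoulder(p_eye):
    """Map a point from eye-centered to shoulder-centered coordinates."""
    p_head = add(apply(R_eye_in_head, p_eye), t_eye_in_head)
    return add(apply(R_head_on_body, p_head), t_head_on_body)

# Gaze-centered target and current hand positions (illustrative, meters).
target_eye = [0.40, 0.10, -0.10]
hand_eye   = [0.30, -0.05, -0.20]

# Geometrically correct shoulder-centered desired movement vector.
target_sh = eye_to_shoulder(target_eye)
hand_sh   = eye_to_shoulder(hand_eye)
dv_shoulder = [t - h for t, h in zip(target_sh, hand_sh)]

# A "retinal-only" plan that ignores the extraretinal eye/head signals.
dv_naive = [t - h for t, h in zip(target_eye, hand_eye)]

# The discrepancy between dv_shoulder and dv_naive predicts the reach
# errors a subject would make if the brain ignored eye/head rotation;
# the paper's compensation index compares observed errors against such
# model predictions.
```

Note that in the difference of two transformed points the fixed translations cancel, so for a single fixation only the composed rotation bends the movement vector; the offsets between centers of rotation matter once eye and head orientations vary across gaze positions, which is the manipulation the study exploits.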
Similar articles
- Role of eye, head, and shoulder geometry in the planning of accurate arm movements. J Neurophysiol. 2002 Apr;87(4):1677-85. doi: 10.1152/jn.00509.2001. PMID: 11929889
- Accurate planning of manual tracking requires a 3D visuomotor transformation of velocity signals. J Vis. 2012 May 25;12(5):6. doi: 10.1167/12.5.6. PMID: 22637707
- Influence of initial hand and target position on reach errors in optic ataxic and normal subjects. J Vis. 2007 Jul 17;7(5):8.1-16. doi: 10.1167/7.5.8. PMID: 18217848
- Spatial transformations for eye-hand coordination. J Neurophysiol. 2004 Jul;92(1):10-9. doi: 10.1152/jn.00117.2004. PMID: 15212434. Review.
- Visuomotor transformations for eye-hand coordination. Prog Brain Res. 2002;140:329-40. doi: 10.1016/S0079-6123(02)40060-X. PMID: 12508600. Review.
Cited by
- Visual perception of axes of head rotation. Front Behav Neurosci. 2013 Feb 15;7:11. doi: 10.3389/fnbeh.2013.00011. PMID: 23919087. Free PMC article.
- Behavioral investigation on the frames of reference involved in visuomotor transformations during peripheral arm reaching. PLoS One. 2012;7(12):e51856. doi: 10.1371/journal.pone.0051856. PMID: 23272180. Free PMC article.
- Effector-dependent stochastic reference frame transformations alter decision-making. J Vis. 2022 Jul 11;22(8):1. doi: 10.1167/jov.22.8.1. PMID: 35816048. Free PMC article.
- Computations underlying the visuomotor transformation for smooth pursuit eye movements. J Neurophysiol. 2015 Mar 1;113(5):1377-99. doi: 10.1152/jn.00273.2014. PMID: 25475344. Free PMC article.
- Pointing with the wrist: a postural model for Donders' law. Exp Brain Res. 2011 Jul;212(3):417-27. doi: 10.1007/s00221-011-2747-3. PMID: 21643712