Visuo-tactile representation of near-the-body space
- PMID: 15477030
- DOI: 10.1016/j.jphysparis.2004.03.007
Abstract
Here we report findings from neuropsychological investigations showing the existence, in humans, of intersensory integrative systems representing space through the multisensory coding of visual and tactile events. In addition, these findings show that visuo-tactile integration may take place in a privileged manner within a limited sector of space closely surrounding the body surface, i.e., the near-peripersonal space. They also demonstrate that the representation of near-peripersonal space is not static: objects in out-of-reach space can be processed as nearer, depending upon (illusory) visual information about hand position in space and upon the use of tools as physical extensions of reachable space. Finally, new evidence is provided suggesting that the multisensory coding of peripersonal space can be achieved through bottom-up processing that, at least in some instances, is not necessarily modulated by more "cognitive" top-down processing, such as the expectation of being touched. These findings are entirely consistent with the functional properties of multisensory neuronal structures coding near-peripersonal space in monkeys, as well as with behavioral and neuroimaging evidence for the cross-modal coding of space in normal subjects. This high level of convergence ultimately favors the idea that multisensory space coding is achieved through similar multimodal structures in both humans and non-human primates.
