Review

Towards explaining spatial touch perception: Weighted integration of multiple location codes

Stephanie Badde et al. Cogn Neuropsychol. 2016 Feb-Mar;33(1-2):26-47.
doi: 10.1080/02643294.2016.1168791. Epub 2016 Jun 21.

Abstract

Touch is bound to the skin - that is, to the boundaries of the body. Yet, the activity of neurons in primary somatosensory cortex merely mirrors the spatial distribution of the sensors across the skin. To determine the location of a tactile stimulus on the body, the body's spatial layout must be considered. Moreover, to relate touch to the external world, body posture has to be evaluated. In this review, we argue that posture is incorporated, by default, for any tactile stimulus. However, the relevance of the external location and, thus, its expression in behaviour, depends on various sensory and cognitive factors. Together, these factors imply that an external representation of touch dominates over the skin-based, anatomical one when our focus is on the world rather than on our own body. We conclude that touch localization is a reconstructive process that is adjusted to the context while maintaining all available spatial information.

Keywords: Tactile; body posture; localization; multisensory; reference frames.


Figures

Figure 1.
Integration account of crossing effects in touch localization. (A) Serial account of touch localization. Tactile locations are remapped from a skin-based anatomical reference frame into an external reference frame. Tactile location estimates are exclusively based on these externally coded representations. (B) Integration account of touch localization. Anatomical and external tactile codes exist in parallel. Tactile location estimates are based on the weighted integration of both response codes. (C) Crossing effects on accuracy in three tactile localization tasks. In the temporal order judgment (TOJ) and first touch localization (FTL) tasks, two stimuli were successively applied, one to each hand. Participants were instructed to judge the temporal order of the stimuli and to press the button underneath the hand that received the first touch (TOJ), or to indicate the location of the first touch and ignore the second touch (FTL). In the single touch localization task (STL), participants pressed the button underneath the hand that received a single stimulus. All three tasks were executed with crossed (red) and uncrossed (grey) hands by the same participants. Crossing effects in all three tasks were pairwise correlated. Error bars show standard errors of the mean. (D) Graphical description of the integration model. The probability of localizing a touch to the right hand in a single trial, logit⁻¹(θ), is derived from the stimulus's weighted anatomical and external left–right response codes (ρanat and ρext). For each participant (i), the weight parameters (ωanat and ωext) were drawn from population distributions N(μanat, σanat) and N(μext, σext), the parameters of which were concurrently estimated by the model. The individual weights were adjusted to the three different tasks (TOJ, FTL, and STL) by non-individual task context parameters (δanat and δext). Weights varied across individuals (green frame), but the task context parameters did not (blue frame).
Crucially, in the integration model none of the free parameters (non-shaded boxes) varied across postures (red frame); that is, performance was explained by a common weighting for all postures. (E) Goodness of fit of the integration model's predictions of each participant's performance. Posterior predictive distributions (grey bars) – that is, frequency distributions of the fraction of "right hand" responses as predicted by the model – are plotted with the observed fractions (red dots). Data from three tactile localization tasks are shown separately for each hand posture and stimulated hand. Each line represents one participant. Figures adapted from Badde, Heed, et al. (2016), with permission of Springer.
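The trial-level core of the integration model in Figure 1D can be sketched in a few lines: the weighted sum of the anatomical and external left–right response codes is passed through the inverse logit to give a response probability. This is a minimal illustrative sketch only; the code values and weights below are hypothetical, and the paper's full model is hierarchical, with participant-level weights drawn from population distributions and adjusted by task context parameters, none of which is modelled here.

```python
import math

def localization_prob(rho_anat, rho_ext, w_anat, w_ext):
    """Probability of a 'right hand' response under a weighted-integration
    account: theta is the weighted sum of the anatomical and external
    left-right response codes, mapped through the inverse logit."""
    theta = w_anat * rho_anat + w_ext * rho_ext
    return 1.0 / (1.0 + math.exp(-theta))  # logit^-1(theta)

# Illustrative values (not from the paper): +1 codes "right", -1 codes "left".
# Uncrossed hands: anatomical and external codes agree.
p_uncrossed = localization_prob(rho_anat=1, rho_ext=1, w_anat=1.5, w_ext=1.0)

# Crossed hands: the two codes conflict, pulling the response
# probability toward chance -- the crossing effect.
p_crossed = localization_prob(rho_anat=1, rho_ext=-1, w_anat=1.5, w_ext=1.0)
```

With these illustrative weights, the conflicting codes in the crossed posture yield a markedly lower probability of a correct "right hand" response than the congruent uncrossed posture, which is the qualitative signature of the crossing effect the model captures.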
Figure 2.
Motor effects on gaze-dependent coding of tactile targets. (A) Conditions adapted from Mueller and Fiehler (2014a). Participants fixated one of the fixation lights (white ovals). In the stationary condition, tactile stimuli were applied to one of three possible locations on the left forearm (grey circles), which remained stationary at the target position. In the movement condition, the arm was always moved before and after tactile stimulation, guided by a slider on a rail. Responses were given by pointing with the right hand. (B) Mean horizontal reach errors for the two gaze conditions (fixed vs. shifted) and the two modes of target presentation (stationary vs. moved) as a function of gaze relative to target. Reach errors were collapsed across the three target locations and averaged across subjects.
Figure 3.
Pointing to tactile targets. (A) Conditions from Brandes and Heed (2015). In each trial, participants initiated a straight reach. When the hand passed a trigger location (ca. 10 cm into the reach), the participant received a visual or tactile stimulus at one of the uncrossed or crossed feet and had to redirect the reach to this stimulus. (B) Spatial characteristics of the resulting reach trajectories. Single-subject example of mean trajectories; reaches to the left target were flipped to be analysed together with reaches to the right target. Points display single-trial turn points towards the correct goal location for reaches to visual and tactile targets located at uncrossed (light blue/red) or crossed feet (dark blue/red). Dashed line indicates the mean of a subset of reaches that first deviated towards the incorrect side of space (about 15% of reaches, termed "turn-around trajectories"). When turn-around reaches were excluded, the remaining 85% of trials showed trajectories that continued straight and then immediately turned to the correct target. Figure adapted from Brandes and Heed (2015).
Figure 4.
Task context effects on the crossing effect in tactile temporal order judgment (TOJ). (A) Procedure and conditions from Badde, Röder, et al. (2015). In each trial, participants perceived two successive tactile stimuli, one to each hand. The vibration frequency of the stimuli varied independently of their location. First, participants performed the TOJ task – that is, they indicated the location of the first stimulus by a button press with the respective hand. Second, participants verbally indicated either the temporal order of the two vibration frequencies or their spatial arrangement. To avoid confounds between left–right responses in the TOJ task and left–right responses in the secondary spatial task, participants reported the colour of the box in which the faster stimulus was located. In Experiment 1, the colours were associated with the hands, accentuating anatomical coding. In Experiment 2, the coloured boxes were attached to one side of the table and, thus, were associated with a side of space, accentuating external coding. (B) Accuracy in the primary TOJ task. Error rates with uncrossed hands (grey circles) were unaffected by the additional judgments. In contrast, error rates with crossed hands (red squares) decreased with additional temporal judgments and when the anatomical reference frame was accentuated, but not when the external reference frame was accentuated. Error bars depict standard errors of the mean. Figure adapted with permission from Badde, Röder, et al. (2015).


