Directional biases in whole hand motion perception revealed by mid-air tactile stimulation
- PMID: 34280867
- PMCID: PMC8422163
- DOI: 10.1016/j.cortex.2021.03.033
Abstract
Many emerging technologies are attempting to leverage the tactile domain to convey complex spatiotemporal information translated directly from the visual domain, such as shape and motion. Despite the intuitive appeal of touch for communication, we do not know to what extent the hand can substitute for the retina in this way. Here we ask whether the tactile system can be used to perceive complex whole hand motion stimuli, and whether it exhibits the same kind of established perceptual biases as reported in the visual domain. Using ultrasound stimulation, we were able to project complex moving dot percepts onto the palm in mid-air, over 30 cm above an emitter device. We generated dot kinetogram stimuli involving motion in three different directional axes ('Horizontal', 'Vertical', and 'Oblique') on the ventral surface of the hand. Using Bayesian statistics, we found clear evidence that participants were able to discriminate tactile motion direction. Furthermore, there was a marked directional bias in motion perception: participants were both better and more confident at discriminating motion in the vertical and horizontal axes of the hand than at discriminating stimuli moving obliquely. This pattern directly mirrors the perceptual biases that have been robustly reported in the visual domain, termed the 'Oblique Effect'. These data demonstrate the existence of biases in motion perception that transcend sensory modality. Furthermore, we extend the Oblique Effect to a whole hand scale, using motion stimuli presented on the broad and relatively low acuity surface of the palm, away from the densely innervated and much studied fingertips. These findings highlight targeted ultrasound stimulation as a versatile method to convey potentially complex spatial and temporal information without the need for a user to wear or touch a device.
Keywords: Confidence; Haptics; Human–computer interaction; Somatosensory; Touch.
Copyright © 2021 The Authors. Published by Elsevier Ltd. All rights reserved.
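As an illustration of the kind of stimulus described in the abstract, the sketch below generates focal-point trajectories for a dot kinetogram drifting along one of the three tested axes ('Horizontal', 'Vertical', 'Oblique'). This is a minimal Python sketch under assumed parameters; the workspace size, dot count, speed, update rate, coherence handling, and wrap-around behaviour are all hypothetical and are not taken from the paper, and the code does not model the ultrasound hardware or the authors' actual stimulus software.

```python
import numpy as np

# Hypothetical parameters; none of these values come from the paper.
PALM_RADIUS_MM = 40.0   # approximate half-width of the palm workspace
N_DOTS = 8              # number of simultaneously rendered focal points
SPEED_MM_S = 50.0       # drift speed of each dot
UPDATE_HZ = 200         # trajectory update rate
DURATION_S = 1.0        # stimulus duration

AXES_DEG = {"Horizontal": 0.0, "Vertical": 90.0, "Oblique": 45.0}

def dot_kinetogram(axis="Horizontal", coherence=1.0, seed=0):
    """Return an array of shape (frames, N_DOTS, 2) of xy positions in mm.

    Coherent dots drift along the chosen axis; the remainder are given
    random directions, as in a standard dot kinetogram.
    """
    rng = np.random.default_rng(seed)
    frames = int(DURATION_S * UPDATE_HZ)
    step = SPEED_MM_S / UPDATE_HZ

    # Random starting positions within the palm workspace.
    pos = rng.uniform(-PALM_RADIUS_MM, PALM_RADIUS_MM, size=(N_DOTS, 2))

    # Coherent dots share the axis direction; the rest move randomly.
    coherent = rng.random(N_DOTS) < coherence
    angles = np.where(coherent,
                      np.deg2rad(AXES_DEG[axis]),
                      rng.uniform(0, 2 * np.pi, N_DOTS))
    velocity = step * np.column_stack([np.cos(angles), np.sin(angles)])

    out = np.empty((frames, N_DOTS, 2))
    for f in range(frames):
        pos = pos + velocity
        # Wrap dots that leave the workspace back to the opposite side.
        pos = (pos + PALM_RADIUS_MM) % (2 * PALM_RADIUS_MM) - PALM_RADIUS_MM
        out[f] = pos
    return out

trajectory = dot_kinetogram("Oblique")
print(trajectory.shape)  # (200, 8, 2): frames x dots x (x, y)
```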