Representational Similarity Analysis for Tracking Neural Correlates of Haptic Learning on a Multimodal Device
- PMID: 37556331
- PMCID: PMC10605963
- DOI: 10.1109/TOH.2023.3303838
Abstract
A goal of wearable haptic devices has been to enable haptic communication, where individuals learn to map information typically processed visually or aurally onto haptic cues through cross-modal associative learning. Neural correlates have been used to evaluate haptic perception and may provide a more objective way to assess association performance than the more commonly used behavioral measures. In this article, we examine Representational Similarity Analysis (RSA) of electroencephalography (EEG) as a framework for evaluating how the neural representation of multifeatured haptic cues changes with association training. We focus on the first phase of cross-modal associative learning: perception of multimodal cues. A participant learned to map phonemes to multimodal haptic cues, and EEG data were acquired before and after training to create neural representational spaces that were compared to theoretical models. Our perceptual model correlated better with the neural representational space before training, while the feature-based model correlated better with the post-training data. These results suggest that training may sharpen the sensory response to haptic cues. Our results show promise that an EEG-RSA approach can capture a shift in the representational space of cues and thereby serve as a means to track haptic learning.
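In RSA, a representational space is typically summarized as a representational dissimilarity matrix (RDM): the pairwise dissimilarities between the response patterns evoked by each cue, which can then be rank-correlated with candidate model RDMs. The sketch below illustrates that comparison in Python; the array shapes, the correlation-distance metric, and all names are illustrative assumptions, not the authors' pipeline.

```python
# Minimal RSA sketch (illustrative assumptions throughout, not the
# published pipeline). Given trial-averaged EEG patterns per haptic cue,
# build a neural RDM and rank-correlate it with a theoretical model RDM.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

def neural_rdm(patterns):
    """patterns: (n_cues, n_features) array, e.g. channels x time, flattened.
    Returns an (n_cues, n_cues) dissimilarity matrix (1 - Pearson r)."""
    return squareform(pdist(patterns, metric="correlation"))

def rdm_similarity(rdm_a, rdm_b):
    """Spearman correlation between the upper triangles of two RDMs,
    a common rank-based comparison in RSA."""
    iu = np.triu_indices_from(rdm_a, k=1)
    rho, p = spearmanr(rdm_a[iu], rdm_b[iu])
    return rho, p

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_cues, n_features = 8, 64 * 50  # hypothetical: 8 cues, 64 channels x 50 samples
    pre = rng.standard_normal((n_cues, n_features))   # stand-in for pre-training EEG
    post = rng.standard_normal((n_cues, n_features))  # stand-in for post-training EEG
    model = rng.standard_normal((n_cues, n_cues))     # stand-in model RDM
    model = (model + model.T) / 2                     # symmetrize
    np.fill_diagonal(model, 0)                        # zero self-dissimilarity
    print("pre vs model: ", rdm_similarity(neural_rdm(pre), model))
    print("post vs model:", rdm_similarity(neural_rdm(post), model))
```

Spearman correlation over the RDM upper triangles is a typical choice because it assumes only a monotonic, not a linear, relationship between neural and model dissimilarities; comparing the pre- and post-training correlations for each model RDM is one way a training-related representational shift, like the one the abstract reports, could be quantified.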