Sci Rep. 2025 Jul 12;15(1):25202. doi: 10.1038/s41598-025-07718-8.

Vibrotactile speech cues are associated with enhanced auditory processing in middle and superior temporal gyri


Alina Schulte et al. Sci Rep.

Abstract

Combined auditory and tactile stimuli have been found to enhance speech-in-noise perception both in individuals with normal hearing and in those with hearing loss. While behavioral benefits of audio-tactile enhancements in speech understanding have been repeatedly demonstrated, the impact of vibrotactile cues on cortical auditory speech processing remains unknown. Using functional near-infrared spectroscopy (fNIRS) with a dense montage setup, we first identified a region-of-interest highly sensitive to auditory-only speech-in-quiet. In the same region, we then assessed the change in activity ('audio-tactile gains') when presenting speech-in-noise together with a single-channel vibratory signal to the fingertip, congruent with the speech envelope's rate of change. In data from 21 participants with normal hearing, audio-tactile speech elicited on average 20% greater hemodynamic oxygenation changes than auditory-only speech-in-noise within bilateral middle and superior temporal gyri. However, audio-tactile gains did not exceed the sum of the unisensory responses, providing no conclusive evidence of true multisensory integration. Our results support a metamodal theory for the processing of temporal speech features in the middle and superior temporal gyri, providing the first evidence of audio-tactile speech processing in auditory areas using fNIRS. Top-down modulations from somatosensory areas or attention networks likely contributed to the observed audio-tactile gains through temporal entrainment with the speech envelope's rate of change. Further research is needed to understand these neural responses in relation to their behavioral relevance for speech perception, offering future directions for developing tactile aids for individuals with hearing impairments.
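The gain and superadditivity criteria described above can be made concrete with a brief sketch. The following Python snippet is illustrative only: the data are simulated and the variable names are assumptions, not the study's actual analysis pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n_participants = 21

# Simulated per-participant HbO responses (e.g., GLM betas, arbitrary units)
# in the auditory ROI; real values would come from the fNIRS analysis.
hbo_audio_noise = rng.normal(1.0, 0.3, n_participants)                           # auditory-only speech-in-noise
hbo_tactile = rng.normal(0.3, 0.2, n_participants)                               # tactile-only
hbo_audiotactile = 1.2 * hbo_audio_noise + rng.normal(0.0, 0.1, n_participants)  # ~20% larger audio-tactile response

# Audio-tactile gain: additional activation when the vibrotactile cue is added.
at_gain = hbo_audiotactile - hbo_audio_noise
print(f"Mean gain: {100 * at_gain.mean() / hbo_audio_noise.mean():.0f}% of the auditory-only response")

# Superadditivity criterion: multisensory integration is often inferred only when
# the multisensory response exceeds the sum of the unisensory responses.
print("Exceeds sum of unisensory responses:",
      hbo_audiotactile.mean() > (hbo_audio_noise + hbo_tactile).mean())
```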

Keywords: Audio-tactile perception; Auditory processing; Functional near-infrared spectroscopy; Multimodal speech; Multisensory processing.


Conflict of interest statement

Declarations. Competing interests: The authors declare no competing interests.

Figures

Fig. 1
Experimental setup and paradigm. (a) Schematic illustration of a participant in the lab receiving auditory stimuli through insert earphones and vibrotactile stimulation on the right index fingertip. (b) One trial of the experimental procedure including auditory, tactile, and visual stimulus presentations. (c) Auditory and vibrotactile signals corresponding to the sentence “The house had a nice garden”. The upper panel shows rectified waveforms of the auditory and tactile stimuli. The tactile stimulus (in orange) is a 230 Hz carrier modulated by the rate of change of the auditory envelope. The bottom panel displays area-filled envelopes of the auditory and tactile speech signals. Note that in the experiment, three sentences were always presented consecutively, forming one stimulus block, as illustrated in (b).
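The tactile signal described in (c), a 230 Hz carrier modulated by the rate of change of the auditory envelope, can be sketched as follows. This is a minimal illustration assuming a low-pass-filtered Hilbert envelope and a half-wave-rectified derivative; the cutoff frequency, scaling, and function name are assumptions, not the study's exact parameters.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def make_vibrotactile_cue(audio, fs, carrier_hz=230.0, env_cutoff_hz=16.0):
    """Derive a single-channel vibrotactile cue from a speech waveform.

    Illustrative pipeline: low-pass-filtered Hilbert envelope, half-wave-rectified
    derivative (rate of change), then amplitude modulation of a 230 Hz carrier.
    Parameters are assumptions, not the study's settings.
    """
    # Broadband amplitude envelope, smoothed to the slow speech rhythm.
    envelope = np.abs(hilbert(audio))
    b, a = butter(2, env_cutoff_hz / (fs / 2), btype="low")
    envelope = filtfilt(b, a, envelope)

    # Rate of change of the envelope, keeping only the rising portions.
    rate_of_change = np.maximum(np.gradient(envelope) * fs, 0.0)
    rate_of_change /= rate_of_change.max() + 1e-12  # normalize to [0, 1]

    # Amplitude-modulate the 230 Hz tactile carrier.
    t = np.arange(len(audio)) / fs
    return rate_of_change * np.sin(2 * np.pi * carrier_hz * t)

# Example with a synthetic 1-second amplitude-modulated tone at 16 kHz.
fs = 16000
t = np.arange(fs) / fs
audio = np.sin(2 * np.pi * 150 * t) * (0.5 + 0.5 * np.sin(2 * np.pi * 4 * t))
tactile = make_vibrotactile_cue(audio, fs)
```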
Fig. 2
Single-channel group results. HbO waveform mean amplitudes and beta-values for all experimental conditions are displayed in lateral views of the left and right hemispheres. Responses are displayed as colored tubes at their respective positions in the montage. Thicker tubes represent channels that remained significant after FDR correction.
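The FDR correction mentioned in the caption is a standard multiple-comparisons step across channels. As a hypothetical illustration (the exact procedure and software used in the study are not stated in this caption), a Benjamini-Hochberg correction over per-channel p-values could be applied as follows.

```python
import numpy as np
from statsmodels.stats.multitest import multipletests

# Simulated per-channel p-values from a single-channel group analysis.
rng = np.random.default_rng(1)
p_values = rng.uniform(0.0, 0.2, size=50)

# Benjamini-Hochberg FDR correction; channels with reject == True correspond
# to the "significant channels after FDR correction" drawn as thicker tubes.
reject, p_fdr, _, _ = multipletests(p_values, alpha=0.05, method="fdr_bh")
print(f"{reject.sum()} of {reject.size} channels survive FDR correction")
```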
Fig. 3
Montage with auditory-ROI. Lateral views of left and right hemispheres. Red circles indicate the positions of sources, and brown circles the positions of detectors. Short-channel detectors are highlighted in yellow. Channels between optodes are displayed in white, except for those forming the auditory ROI, which are displayed in green.
Fig. 4
ROI results. (a) Grand average waveforms for each task condition in the auditory-ROI. HbO traces are displayed in red, HbR traces in blue, with shaded areas representing 95% confidence intervals. Both chromophores are displayed for illustrative purposes, but only HbO was analyzed statistically. (b) Distributions of HbO waveform mean amplitudes and HbO beta-values in the auditory-ROI are shown for each task condition using split violin plots. (c) Individual audio-tactile gains in the auditory-ROI. Each participant’s audio-tactile gain, as derived from the GLM and the waveform averaging analysis, is shown in a bar plot, sorted by size based on the GLM-derived gains.
Fig. 5
Auditory-ROI and tactile-ROI. Both ROIs were defined in a data-driven approach, based on the channels that responded most consistently with the largest HbO amplitudes across the participant group. With the exception of one channel (S3-D4), the two ROIs comprised distinct channel locations that can be attributed to somatosensory vs. auditory processing. S = Source, D = Detector.
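The data-driven ROI definition described here can be sketched as a simple channel-ranking step. The selection rule below (a mean/SEM score and a fixed number of top channels) is an assumption for illustration, not the study's exact criterion.

```python
import numpy as np

# Simulated group data: participants x channels HbO mean amplitudes (a.u.).
rng = np.random.default_rng(2)
n_participants, n_channels = 21, 50
hbo = rng.normal(0.2, 0.5, size=(n_participants, n_channels))

# Rank channels by a consistency-weighted group response (mean divided by the
# standard error of the mean) and take the highest-scoring channels as the ROI.
sem = hbo.std(axis=0, ddof=1) / np.sqrt(n_participants)
score = hbo.mean(axis=0) / sem
roi_channels = np.argsort(score)[::-1][:5]  # indices of the 5 top-ranked channels
print("ROI channel indices:", roi_channels)
```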
