Bridging verbal coordination and neural dynamics

Isaïh Schwab-Mohamed et al. eLife. 2025 Aug 6;13:RP99547. doi: 10.7554/eLife.99547.

Abstract

Our use of language, which is profoundly social in nature, essentially takes place in interactive contexts and is shaped by precise coordination dynamics that interlocutors must observe. Language interaction therefore places high demands on the fast adjustment of speech production. Here, we developed a real-time coupled-oscillators virtual partner (VP) that allows, by changing the coupling strength parameters, modulation of the ability to synchronise speech with a virtual speaker. We then recorded the intracranial brain activity of 16 patients with drug-resistant epilepsy while they performed a verbal coordination task with the VP. More precisely, patients had to repeat short sentences synchronously with the VP. This synchronous speech task is effective in highlighting both the dorsal and ventral language pathways. Importantly, combining time-resolved verbal coordination and neural activity reveals more spatially differentiated patterns and different types of neural sensitivity along the dorsal pathway. More precisely, high-frequency activity (HFa) in left secondary auditory regions is highly sensitive to verbal coordinative dynamics, while primary regions are not. Finally, although bilateral engagement was observed in the HFa of the inferior frontal gyrus BA44 - which seems to index the online coordinative adjustments continuously required to compensate for deviations from synchronisation - interpretation of right-hemisphere involvement should be approached cautiously given the relatively sparse electrode coverage. These findings illustrate the possibility and value of using a fully dynamic, adaptive, and interactive language task to gain a deeper understanding of the neural dynamics underlying speech perception and production, as well as their interaction.
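The real-time coupled-oscillators virtual partner can be pictured as two phase oscillators in which a coupling-strength parameter sets how strongly the VP is pulled toward (or, with a 180° shift, away from) the participant's phase. The following Python sketch is a hypothetical, minimal Kuramoto-style illustration of that idea, not the authors' implementation; all parameter names and values are assumptions.

import numpy as np

def simulate_coupled_oscillators(f_vp=4.0, f_p=4.2, K=0.5, phase_shift=0.0,
                                 duration=10.0, fs=100.0):
    """Toy Kuramoto-style model of the virtual partner (VP): the VP phase is
    pulled toward the participant's phase (optionally shifted by 180 degrees)
    with coupling strength K. Hypothetical sketch only."""
    n = int(duration * fs)
    dt = 1.0 / fs
    phi_vp = np.zeros(n)
    phi_p = np.zeros(n)
    for t in range(1, n):
        # Participant modelled here as a fixed-frequency oscillator (no adaptation).
        phi_p[t] = phi_p[t - 1] + 2 * np.pi * f_p * dt
        # VP: natural frequency plus coupling toward the (possibly shifted) participant phase.
        coupling = K * np.sin(phi_p[t - 1] + phase_shift - phi_vp[t - 1])
        phi_vp[t] = phi_vp[t - 1] + (2 * np.pi * f_vp + coupling) * dt
    return phi_vp, phi_p

# In-phase coupling vs coupling with a 180° shift (the two VP parameter settings).
phi_vp_in, phi_p_in = simulate_coupled_oscillators(phase_shift=0.0)
phi_vp_anti, phi_p_anti = simulate_coupled_oscillators(phase_shift=np.pi)

In this toy model, a larger K makes the VP converge faster to the target phase relation, while K = 0 leaves the two oscillators uncoupled.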

Keywords: human; inferior frontal gyrus; mutual adaptation; neural dynamics; neuroscience; speech coordination; superior temporal gyrus; synchronous speech.


Conflict of interest statement

IS, MM, AT, BM, LL, DS: No competing interests declared.

Figures

Figure 1. Anatomical localisation of the stereotactic EEG (sEEG) electrodes for each patient, projected in MNI space on the lateral 3D view (top) and on the top view (N = 16).
Figure 2. Paradigm and coordination indices.
(A) Top: illustration of one trial of the interactive synchronous speech repetition task (orange: virtual partner [VP] speech; blue: participant speech; the stimulus papi m’a dit is repeated 10 times; only the first 10 seconds are shown). Bottom: the four speech utterances used in the task and the experimental procedure. (B) Speech signal processing stages. The top panel shows the speech envelope, the second the phase of the speech envelope, and the third the phase difference between VP and participant speech envelopes, illustrating the coordination dynamics along one trial. (C) Left: distributions of the verbal coordination index (phase-locking values between VP and participant speech envelopes, for each trial) for control participants (top) and patients. Right: boxplots for control participants (top) and patients showing the trial-averaged verbal coordination index as a function of the VP parameters (in-phase coupling vs coupling with a 180° shift).
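As a concrete handle on the verbal coordination index defined above, here is a minimal Python sketch of a phase-locking value (PLV) between two speech envelopes, with phase extracted via the Hilbert transform. The toy signals and variable names are hypothetical; this is not the authors' analysis code.

import numpy as np
from scipy.signal import hilbert

def verbal_coordination_index(env_vp, env_p):
    """Phase-locking value (PLV) between two speech envelopes: length of the
    mean resultant vector of their instantaneous phase difference."""
    phase_vp = np.angle(hilbert(env_vp))
    phase_p = np.angle(hilbert(env_p))
    return np.abs(np.mean(np.exp(1j * (phase_vp - phase_p))))

# Toy example: two noisy ~4 Hz envelopes sampled at 100 Hz.
fs = 100
t = np.arange(0, 10, 1 / fs)
env_vp = 1 + np.cos(2 * np.pi * 4 * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)
env_p = 1 + np.cos(2 * np.pi * 4 * t + 0.3) + 0.1 * np.random.default_rng(1).normal(size=t.size)
print(verbal_coordination_index(env_vp, env_p))  # near 1 for tightly coordinated envelopes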
Figure 2—figure supplement 1. Distribution of the verbal coordination index for each patient (PLV between the patient’s speech and the virtual partner [VP] speech).
For each of the 16 patients, this figure depicts the histogram of the coordination index for all trials (in blue) as well as the null distribution (random phase shift) computed using 500 permutations per trial (in red).
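One way to build such a permutation null distribution is to recompute the PLV after a random circular shift of one envelope; whether this matches the authors' exact surrogate procedure is an assumption. The sketch below reuses the hypothetical verbal_coordination_index helper and toy envelopes from the previous sketch.

import numpy as np

def plv_null_distribution(env_vp, env_p, n_perm=500, seed=None):
    """Surrogate PLV distribution from random circular shifts of the
    participant envelope (one possible 'random phase shift' scheme)."""
    rng = np.random.default_rng(seed)
    null = np.empty(n_perm)
    for i in range(n_perm):
        shift = rng.integers(1, env_p.size)
        null[i] = verbal_coordination_index(env_vp, np.roll(env_p, shift))
    return null

null = plv_null_distribution(env_vp, env_p, n_perm=500, seed=0)
p_value = np.mean(null >= verbal_coordination_index(env_vp, env_p))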
Figure 3. Power spectrum analyses and correlation with the verbal coordination index (VCI; left hemisphere).
Each dot represents a channel with a significant effect either in (A) global activity (task vs rest) for each frequency band, expressed in % of power change compared to rest, or in (B) behaviour-related activity: r values of the Spearman correlation across trials between the iEEG power and the VCI. (C) Proportion of channels with a significant effect in the task versus rest comparison (orange), in the brain–behaviour correlation (green), or in both (blue). The percentage in the centre indicates the overall proportion of significant channels from the three categories relative to the total number of channels.
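For one channel, the two quantities plotted in panels A and B could be computed roughly as follows (hypothetical arrays; scipy.stats.spearmanr for the across-trial correlation). This is an illustrative sketch, not the authors' pipeline.

import numpy as np
from scipy.stats import spearmanr

def percent_power_change(power_task, power_rest):
    """Global activity: task power expressed in % change relative to rest."""
    return 100 * (np.mean(power_task) - np.mean(power_rest)) / np.mean(power_rest)

def brain_behaviour_correlation(power_per_trial, vci_per_trial):
    """Behaviour-related activity: Spearman correlation across trials between
    a channel's power and the verbal coordination index."""
    return spearmanr(power_per_trial, vci_per_trial)  # (r, p-value)

# Hypothetical data for one channel over 40 trials.
rng = np.random.default_rng(0)
vci = rng.uniform(0.2, 0.9, 40)
hfa_task = 2.0 + 1.5 * vci + rng.normal(0, 0.3, 40)   # HFa power tracking coordination
hfa_rest = rng.normal(1.0, 0.1, 40)
print(percent_power_change(hfa_task, hfa_rest))
print(brain_behaviour_correlation(hfa_task, vci))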
Figure 3—figure supplement 1. Power spectrum analyses and correlation with the verbal coordination index (VCI; right hemisphere).
Each dot represents a channel with a significant effect either in (A) global activity (task vs rest) for each frequency band, expressed in % of power change compared to rest, or in (B) behaviour-related activity: r values of the Spearman correlation across trials between the iEEG power and the VCI. (C) Proportion of channels with a significant effect in the task versus rest comparison (orange), in the brain–behaviour correlation (green), or in both (blue). The percentage in the centre indicates the overall proportion of significant channels from the three categories relative to the total number of channels.
Figure 3—figure supplement 2. Cluster analysis (silhouette score).
Mean silhouette scores as a function of the number of clusters for global activity (in red) and behaviour-related activity (correlation between power changes and coordination index, in blue). The highest silhouette score was obtained for five clusters in the high-frequency activity (HFa) range for behaviour-related activity (framed in a solid black square). Bottom right: spatial representation of the clusters for the highest mean silhouette score in the HFa range.
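Selecting the number of clusters by the mean silhouette score can be sketched with scikit-learn as below; the feature matrix (for example, channel coordinates together with the behaviour-related r values) is a hypothetical stand-in for the features actually clustered in the paper.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def best_n_clusters(features, k_range=range(2, 10), random_state=0):
    """Return the cluster count with the highest mean silhouette score."""
    scores = {}
    for k in k_range:
        labels = KMeans(n_clusters=k, n_init=10, random_state=random_state).fit_predict(features)
        scores[k] = silhouette_score(features, labels)
    return max(scores, key=scores.get), scores

# Hypothetical features: MNI coordinates plus the behaviour-related r value per channel.
features = np.random.default_rng(1).normal(size=(120, 4))
k_best, scores = best_n_clusters(features)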
Figure 4. Group analysis by regions of interest (ROI) for the left hemisphere.
(A) ROIs defined according to the cluster analysis (see Figure 3—figure supplement 2); the delimitation of regions is based on the Brainnetome atlas. (B) For each ROI, boxplots illustrate, in red, channels with significant global power changes (high-frequency activity [HFa], task vs rest) and, in blue, their corresponding r values (correlation between HFa power and verbal coordination index, VCI). Red and blue stars indicate a significant difference from a null distribution. Dots represent independent iEEG channels. The ‘n’ below each ROI specifies the number of patients. STG: superior temporal gyrus; IPL: inferior parietal lobule; IFG: inferior frontal gyrus; BA: Brodmann area.
Figure 4—figure supplement 1. Group analysis by regions of interest (ROI; right hemisphere).
For each ROI, boxplots illustrate, in red, channels with significant global power changes (high-frequency activity [HFa], task vs rest) and, in blue, their corresponding r values (correlation between HFa power and verbal coordination index, VCI). Red and blue stars indicate a significant difference from a null distribution. Dots represent independent iEEG channels. The ‘n’ below each ROI specifies the number of patients. STG: superior temporal gyrus; IPL: inferior parietal lobule; IFG: inferior frontal gyrus; BA: Brodmann area.
Figure 5. Phase-amplitude coupling (PAC) between the virtual partner (VP) speech signal or the coordination dynamics and high-frequency activity (HFa).
(A) Increase in PAC, expressed in % relative to the surrogate mean, when using as phase the VP speech (left) or the coordination dynamics (phase difference between VP and patient, right). The shaded (light blue) area corresponds to the location of the inferior frontal gyrus (IFG) BA44. (B) PAC values for VP speech (in red) and phase difference (in blue) by regions of interest. The statistical difference between the two types of PAC is calculated using a paired Wilcoxon test (STG BA41/42: p = 0.01; STG BA22: p = 0.004; inferior parietal lobule [IPL] BA40: p = 0.6; IFG BA44: p = 0.02). The y-axis range has been adjusted to better illustrate the contrast between VP speech and coordination dynamics.
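A common way to obtain a PAC value expressed in % relative to surrogates is the mean-vector-length estimator with time-shifted surrogates, sketched below in Python; the authors' exact estimator and surrogate scheme may differ, and the toy data are hypothetical. A paired Wilcoxon comparison between the two PAC types (VP speech phase vs coordination-dynamics phase) is shown at the end.

import numpy as np
from scipy.stats import wilcoxon

def pac_vs_surrogates(phase, amplitude, n_surr=200, seed=None):
    """Mean-vector-length PAC between a low-frequency phase and the HFa
    amplitude, expressed in % increase over the mean of time-shifted surrogates."""
    rng = np.random.default_rng(seed)
    pac = np.abs(np.mean(amplitude * np.exp(1j * phase)))
    surr = np.array([
        np.abs(np.mean(np.roll(amplitude, rng.integers(1, amplitude.size)) * np.exp(1j * phase)))
        for _ in range(n_surr)
    ])
    return 100 * (pac - surr.mean()) / surr.mean()

# Toy channel: ~4 Hz phase weakly modulating the HFa amplitude envelope.
t = np.arange(0, 10, 1 / 100)
phase = (2 * np.pi * 4 * t) % (2 * np.pi) - np.pi
amp = 1 + 0.3 * np.cos(phase) + 0.1 * np.random.default_rng(2).normal(size=t.size)
print(pac_vs_surrogates(phase, amp, seed=3))

# Paired Wilcoxon test across channels between the two PAC types (hypothetical values).
pac_vp = np.random.default_rng(4).normal(5, 2, 30)
pac_dyn = np.random.default_rng(5).normal(8, 2, 30)
stat, p = wilcoxon(pac_vp, pac_dyn)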
Figure 5—figure supplement 1. Phase-amplitude coupling (PAC) analyses by region of interest (right hemisphere).
PAC expressed in % compared to surrogates, using as phase either the virtual partner (VP) speech (in red) or the coordination dynamics (phase difference between VP and participant, in blue), and as amplitude the high-frequency activity. The statistical difference between the two types of PAC is calculated using a paired Wilcoxon test (STG BA41/42: p < 0.0001; STG BA22: p = 0.9; inferior parietal lobule [IPL] BA40: p = 0.2; inferior frontal gyrus [IFG] BA44: p = 0.01). The y-axis range has been adjusted to better illustrate the contrast between VP speech and coordination dynamics.
Figure 5—figure supplement 2. Phase-amplitude coupling (PAC) according to the behavioural delay (left hemisphere).
PAC expressed in % compared to surrogates, using as phase the coordination dynamics (phase difference between the virtual partner [VP] and the participant) and as amplitude the high-frequency activity. Comparison between PAC on trials with negative and positive delays (see ‘Coupling behavioural and neurophysiological data’ in Materials and methods). Note that the y-axis range has been adjusted per panel.

Update of

  • doi: 10.1101/2024.04.23.590817
  • doi: 10.7554/eLife.99547.1
  • doi: 10.7554/eLife.99547.2
  • doi: 10.7554/eLife.99547.3

