Sci Rep. 2013;3:1692. doi: 10.1038/srep01692.

Inter-brain synchronization during coordination of speech rhythm in human-to-human social interaction


Masahiro Kawasaki et al. Sci Rep. 2013.

Abstract

Behavioral rhythms synchronize between humans during communication; however, how brain rhythms synchronize between individuals while their speech rhythms synchronize remains unclear. Here, we conducted alternating speech tasks in which two subjects alternately pronounced letters of the alphabet during hyperscanning electroencephalography. Twenty pairs of subjects performed the task before and after each subject individually performed the same task with a machine that pronounced letters at nearly constant intervals. Speech rhythms were more likely to synchronize in human-human tasks than in human-machine tasks. Moreover, theta/alpha (6-12 Hz) amplitudes synchronized over the same temporal and lateral-parietal regions in each pair. Both behavioral and inter-brain synchronization were enhanced after the human-machine tasks. These results indicate that inter-brain synchronization is tightly linked to speech synchronization between subjects. Furthermore, theta/alpha inter-brain synchronization was also found while subjects observed their partner perform the task with the machine, which suggests that the inter-brain synchronization might reflect empathy for others' speech rhythms.


Figures

Figure 1. Experimental setup of alternating speech tasks.
(A) Alternating speech tasks between two human subjects (human–human) and one subject and a machine (human–machine). (B) Schematic illustration of the experiments. Each subject completed 14 sessions comprising two pre-machine human–human sessions, 10 successive human–machine sessions [five voices (electronic, male, female, partner's and subject's) at two paces (fixed and random)], and two post-machine human–human sessions. In each session, the subjects participated in alternating speech for 70 seconds. After each session, each subject rated the subjective qualities of the alternating speech (Q).
Figure 2. Voice data and analyses from alternating speech tasks.
Representative examples of voice data captured by a stereo recorder and a spectrogram from the FFT analysis of those data. The durations of the subject's and partner's voices and the intervals between the voices (dashed lines) were separated and analyzed.
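The duration/interval segmentation described in this legend can be illustrated with a simple amplitude-envelope thresholding pass. This is a hypothetical sketch, not the authors' actual pipeline: the function name `voice_segments`, the 10 ms smoothing window, and the 10% threshold are all assumptions.

```python
import numpy as np

def voice_segments(signal, fs, threshold=0.1):
    """Split a mono recording into voiced segments by envelope thresholding.

    Returns (durations, intervals) in seconds: the length of each voiced
    burst and the silent gap between consecutive bursts.
    Illustrative sketch; the paper's segmentation method may differ.
    """
    # Rectify and smooth with a 10 ms moving average to get an envelope.
    win = max(1, int(0.01 * fs))
    env = np.convolve(np.abs(signal), np.ones(win) / win, mode="same")
    voiced = env > threshold * env.max()

    # Rising and falling edges of the voiced mask mark onsets and offsets.
    d = np.diff(voiced.astype(int))
    onsets = np.flatnonzero(d == 1) + 1
    offsets = np.flatnonzero(d == -1) + 1
    if voiced[0]:
        onsets = np.r_[0, onsets]
    if voiced[-1]:
        offsets = np.r_[offsets, len(voiced)]

    durations = (offsets - onsets) / fs
    intervals = (onsets[1:] - offsets[:-1]) / fs
    return durations, intervals
```

Applied to a recording with two tone bursts of 0.5 s and 0.6 s separated by 0.8 s of silence, this would recover two durations and one interval close to those values.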
Figure 3
(A) Averaged data and representative durations for each letter and (B) the intervals between each letter from one human–human pair before and after participating in the human–machine tasks. (C) Subject-averaged correlations and (D) differences in the durations and intervals between the subjects under each condition.
Figure 4. Frequency amplitudes and associated analyses.
(A) Subject-, time-, and channel-averaged frequency amplitudes of subjects who participated in human–human (red) or human–machine (green) alternating speech tasks, or who observed their partner participating in the speech task with the machine (blue). (B) Topographic scalp maps of the P values for the theta/alpha (6–12 Hz) amplitude differences between the tasks and the inter-trial interval (ITI) of the human–human and human–machine tasks, and for the differences between the human–human and human–machine tasks (left and center). Scalp maps of the P values of the theta/alpha amplitudes for the differences between the pre- and post-machine human–human tasks (right).
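Extracting a theta/alpha (6–12 Hz) amplitude time course from a single EEG channel, as analyzed here, can be sketched with a zero-phase band-pass filter followed by the Hilbert-transform envelope. The paper may well use a different method (e.g. wavelets); the function name and filter order below are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def theta_alpha_amplitude(eeg, fs, lo=6.0, hi=12.0, order=4):
    """Instantaneous 6-12 Hz amplitude of one EEG channel.

    Zero-phase Butterworth band-pass, then the magnitude of the
    analytic signal. A sketch; the paper's method may differ.
    """
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return np.abs(hilbert(filtfilt(b, a, eeg)))
```

A 9 Hz test sine passes through with an envelope near its amplitude, while a 30 Hz sine is strongly attenuated, which is the behavior a 6–12 Hz band amplitude should show.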
Figure 5
(A) Examples of theta/alpha amplitudes on the temporal electrodes (T7) of two subjects and their time-course cross-correlation coefficients during the human–human and human–machine tasks (top and middle). Examples of averaged cross-correlation coefficients for human–human and human–machine tasks (bottom). (B) Scalp maps of the P values, which show significant correlations between the two subjects' theta/alpha amplitudes during the human–human tasks (left) and differences between the correlation coefficients in the pre- and post-machine human–human tasks (right). (C) Scalp maps of the P values, which show significant correlations between the two subjects' theta/alpha amplitudes during the human–machine tasks. (D) Subject-averaged correlation coefficients on the temporal (T7) and lateral parietal (CP2) electrodes during the human–human and human–machine tasks. (E) Scatter plots between the inter-brain correlation coefficients on the temporal (T7) electrodes and the correlation coefficients of the duration and interval of the speech rhythms between the subjects.
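The time-course cross-correlation coefficients in panel (A) can be illustrated as a sliding-window Pearson correlation between the two subjects' amplitude envelopes. The window and step sizes below are arbitrary assumptions, and `sliding_corr` is a hypothetical name, not the paper's implementation.

```python
import numpy as np

def sliding_corr(x, y, fs, win_s=2.0, step_s=0.5):
    """Pearson correlation between two equal-length amplitude series,
    computed in sliding windows of win_s seconds every step_s seconds.
    Returns one coefficient per window position."""
    win, step = int(win_s * fs), int(step_s * fs)
    starts = range(0, len(x) - win + 1, step)
    return np.array([np.corrcoef(x[s:s + win], y[s:s + win])[0, 1]
                     for s in starts])
```

Perfectly synchronized envelopes yield coefficients of 1 in every window, and anti-phase envelopes yield -1, matching the intuition behind the inter-brain correlation maps.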
Figure 6. Analyses of the speech rhythms among the voice types.
(A) Subject-averaged correlations and differences in the durations of the voices and intervals between the voices during the human–machine tasks; five voices were used [electronic (a), male (b), female (c), partner's (d) and subject's (e) voices]. (B) Scalp maps of the P values, which show significant main effects of the voices for correlations between the two subjects' theta/alpha amplitudes under machine conditions (left). Subject-averaged correlation coefficients on the temporal (T7) and lateral parietal (CP2) electrodes for each machine voice.

