2017 Feb 22;12(2):e0172454.
doi: 10.1371/journal.pone.0172454. eCollection 2017.

What can we learn about beat perception by comparing brain signals and stimulus envelopes?


Molly J Henry et al. PLoS One. 2017.

Abstract

Entrainment of neural oscillations on multiple time scales is important for the perception of speech. The perception of musical rhythms, and in particular of a regular beat in musical rhythms, is also likely to rely on entrainment of neural oscillations. One recently proposed approach to studying beat perception in the context of neural entrainment and resonance (the "frequency-tagging" approach) has received an enthusiastic response from the scientific community. A specific version of the approach involves comparing frequency-domain representations of acoustic rhythm stimuli to the frequency-domain representations of neural responses to those rhythms (measured by electroencephalography, EEG). The relative amplitudes at specific EEG frequencies are compared to the relative amplitudes at the same stimulus frequencies, and enhancements at beat-related frequencies in the EEG signal are interpreted as reflecting an internal representation of the beat. Here, we show that frequency-domain representations of rhythms are sensitive to the acoustic features of the tones making up the rhythms (tone duration, onset/offset ramp duration); in fact, relative amplitudes at beat-related frequencies can be completely reversed by manipulating tone acoustics. Crucially, we show that changes to these acoustic tone features, and in turn changes to the frequency-domain representations of rhythms, do not affect beat perception. Instead, beat perception depends on the pattern of onsets (i.e., whether a rhythm has a simple or complex metrical structure). Moreover, we show that beat perception can differ for rhythms that have numerically identical frequency-domain representations. Thus, frequency-domain representations of rhythms are dissociable from beat perception. For this reason, we suggest caution in interpreting direct comparisons of rhythms and brain signals in the frequency domain.
Instead, we suggest that combining EEG measurements of neural signals with creative behavioral paradigms is of more benefit to our understanding of beat perception.
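The frequency-tagging pipeline described above (amplitude envelope via a Hilbert transform, then an FFT, then reading off amplitudes at beat-related frequencies) can be sketched as follows. This is a minimal illustration of the general approach, not the authors' analysis code; the sampling rate, tone durations, and test frequencies are illustrative assumptions.

```python
import numpy as np
from scipy.signal import hilbert

def beat_frequency_amplitudes(waveform, fs, beat_freqs):
    """Amplitude of the stimulus envelope at given beat-related frequencies.

    Sketch of the frequency-tagging approach: Hilbert envelope -> FFT ->
    amplitude at the FFT bin nearest each frequency of interest.
    """
    envelope = np.abs(hilbert(waveform))                 # amplitude envelope
    spectrum = np.abs(np.fft.rfft(envelope)) / len(envelope)
    freqs = np.fft.rfftfreq(len(envelope), d=1.0 / fs)
    return {f: spectrum[np.argmin(np.abs(freqs - f))] for f in beat_freqs}

# Illustration: a 4-s train of 50-ms tone bursts repeating at 5 Hz
fs = 1000
t = np.arange(int(4.0 * fs)) / fs
env = ((t % 0.2) < 0.05).astype(float)                   # 5 Hz onset pattern
rhythm = env * np.sin(2 * np.pi * 440 * t)               # 440-Hz carrier tones
amps = beat_frequency_amplitudes(rhythm, fs, [5.0, 3.25])
```

For this isochronous pattern, the envelope amplitude at the 5 Hz repetition rate dominates a non-beat-related frequency such as 3.25 Hz; the paper's point is that such stimulus spectra also shift with tone duration and ramp duration even when the onset pattern, and perceived beat, stay fixed.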


Conflict of interest statement

Competing Interests: The authors have declared that no competing interests exist.

Figures

Fig 1. Examples of the acoustic manipulations applied to one representative rhythm analyzed in Experiment 1.
The onset structure of the original rhythm (left column) was preserved. Tone duration (middle column) and onset/offset ramp duration (right column) were parametrically varied. After obtaining the amplitude envelopes (middle row) of the stimulus waveforms (top row) via a Hilbert transform, the envelopes were transformed to the stimulus spectra in the frequency domain using an FFT (bottom row). Arrows mark the beat-related frequencies 0.416 Hz (1:12), 1.25 Hz (1:4), 2.5 Hz (1:2), and 5 Hz (1:1).
Fig 2. For each of the 5 stimulus patterns used in [44] (A), frequency-domain amplitudes at beat-related frequencies varied as a function of (B) tone duration (onset/offset ramp duration fixed at 10 ms) and (C) onset/offset ramp duration (tone duration fixed at 200 ms), but the functions were different for each frequency and each rhythm. See Supporting Information S1 Fig for all tested combinations of tone duration and onset/offset ramp duration for all stimulus patterns.
Fig 3. Comparison of envelopes (top) and stimulus spectra (bottom) obtained by using Matlab’s Hilbert function (left) or the MIR-implemented Hilbert transform, shown here for “Pattern 3” (from [44]).
Because the MIR toolbox uses time-domain filtering, its envelopes are smooth and its frequency spectra differ from those obtained with Matlab's Hilbert function. The most obvious discrepancy is at 2.5 Hz, where the spectrum obtained using the Matlab Hilbert function contains no energy.
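The dependence of the stimulus spectrum on the envelope-extraction method can be demonstrated directly. The sketch below compares a raw Hilbert envelope against a low-pass-filtered version, loosely analogous to the time-domain smoothing in the MIR toolbox; the 10-Hz cutoff, filter order, and test stimulus are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def raw_envelope(x):
    """Unsmoothed Hilbert amplitude envelope (Matlab hilbert-style)."""
    return np.abs(hilbert(x))

def smoothed_envelope(x, fs, cutoff_hz=10.0):
    """Low-pass-filtered envelope (MIR-style smoothing; cutoff is illustrative)."""
    b, a = butter(4, cutoff_hz / (fs / 2))
    return filtfilt(b, a, np.abs(hilbert(x)))

fs = 1000
t = np.arange(int(4.0 * fs)) / fs
x = ((t % 0.4) < 0.05) * np.sin(2 * np.pi * 440 * t)   # 2.5 Hz tone pattern

spec_raw = np.abs(np.fft.rfft(raw_envelope(x)))
spec_smooth = np.abs(np.fft.rfft(smoothed_envelope(x, fs)))
freqs = np.fft.rfftfreq(len(t), d=1.0 / fs)
hf = freqs > 20.0   # envelope energy above 20 Hz, attenuated by smoothing
```

Smoothing strips high-frequency envelope energy, so the two "stimulus spectra" for the very same sound are not interchangeable; this is why the choice of envelope-extraction method matters when comparing stimulus and EEG spectra.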
Fig 4. Beat perception does not depend on amplitudes at beat-related frequencies in stimulus spectra.
A. Examples of simple (left column) and complex (right column) rhythms used in Experiment 2a shown together with amplitude spectra. Top: 50-ms tones with 10-ms onset/offset ramps; Middle: 200-ms tones with 10-ms onset/offset ramps; Bottom: 200-ms tones with 100-ms onset/offset ramps. Note the changes to the amplitude spectra that result from changing acoustic stimulus features even for the same onset pattern. B. Beat strength ratings did not change as a function of tone duration (top) or onset/offset ramp duration (bottom) despite their different amplitude spectra. Beat strength ratings did depend on onset pattern (i.e., whether the rhythm was simple or complex, shown in color). C. Beat strength ratings were not significantly correlated with amplitudes at beat-related frequencies 1.25 Hz (top), 2.5 Hz (middle), or 5 Hz (bottom) for any rhythm type (colors same as B). Fisher z-transformed correlation coefficients averaged across participants are shown with standard error of the mean and were not significantly different from zero.
Fig 5. Beat perception differs for rhythms with identical frequency-domain representations.
A. By rotating simple rhythms (i.e., playing them starting from a different point in the sequence), we created two versions of each rhythm with numerically identical frequency-domain representations. B. Beat strength ratings differed significantly between original and rotated rhythms. Individual participant data are shown in gray, and mean data are overlaid in black.
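The reason rotation leaves the frequency-domain representation numerically identical follows from the Fourier shift theorem: a circular shift changes only the phase of each Fourier coefficient, never its magnitude. A short sketch (the 16-step binary pattern is an arbitrary illustration, not one of the paper's stimuli):

```python
import numpy as np

# A circular shift multiplies each Fourier coefficient by a unit-magnitude
# phase factor, so the amplitude spectrum is unchanged even though the
# rhythm now starts from a different point in the sequence.
rng = np.random.default_rng(0)
pattern = rng.integers(0, 2, size=16).astype(float)   # binary onset pattern
rotated = np.roll(pattern, 5)                         # start 5 steps later

amp_original = np.abs(np.fft.fft(pattern))
amp_rotated = np.abs(np.fft.fft(rotated))
```

Since `amp_original` and `amp_rotated` agree to numerical precision while listeners rate the two versions differently, any measure based only on amplitude spectra cannot capture the difference in perceived beat strength.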

References

    1. Fitch WT. The biology and evolution of music: A comparative perspective. Cognition. 2006;100:173–215. 10.1016/j.cognition.2005.11.009 - DOI - PubMed
    2. Hagmann CE, Cook RG. Testing meter, rhythm, and tempo discriminations in pigeons. Behavioural Processes. 2010;85:99–110. - PubMed
    3. McDermott J, Hauser MD. Nonhuman primates prefer slow tempos but dislike music overall. Cognition. 2007;104:654–68. 10.1016/j.cognition.2006.07.011 - DOI - PubMed
    4. Large EW, Palmer C. Perceiving temporal regularity in music. Cognitive Science. 2002;26:1–37.
    5. Parncutt R. A perceptual model of pulse salience and metrical accents in musical rhythms. Music Perception. 1994;11:409–64.