Anatomically distinct cortical tracking of music and speech by slow (1-8Hz) and fast (70-120Hz) oscillatory activity
- PMID: 40341725
- PMCID: PMC12061428
- DOI: 10.1371/journal.pone.0320519
Abstract
Music and speech encode hierarchically organized structural complexity at the service of human expressiveness and communication. Previous research has shown that populations of neurons in auditory regions track the envelope of acoustic signals within the range of slow and fast oscillatory activity. However, the extent to which cortical tracking is influenced by the interplay between stimulus type, frequency band, and brain anatomy remains an open question. In this study, we reanalyzed intracranial recordings from thirty subjects implanted with electrocorticography (ECoG) grids in the left cerebral hemisphere, drawn from an existing open-access ECoG database. Participants passively watched a movie where visual scenes were accompanied by either music or speech stimuli. Cross-correlation between brain activity and the envelope of music and speech signals, along with density-based clustering analyses and linear mixed-effects modeling, revealed both anatomically overlapping and functionally distinct mapping of the tracking effect as a function of stimulus type and frequency band. We observed widespread left-hemisphere tracking of music and speech signals in the Slow Frequency Band (SFB, the band-pass filtered low-frequency signal between 1-8Hz), with near-zero temporal lags. In contrast, cortical tracking in the High Frequency Band (HFB, the envelope of the 70-120Hz band-pass filtered signal) was higher during speech perception, was more densely concentrated in classical language processing areas, and showed a frontal-to-temporal gradient in lag values that was not observed during perception of musical stimuli. Our results highlight a complex interaction between cortical region and frequency band that shapes temporal dynamics during processing of naturalistic music and speech signals.
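The abstract outlines the core analysis: band-pass the ECoG signal into a Slow Frequency Band (1-8 Hz) and a High Frequency Band envelope (70-120 Hz), then cross-correlate each channel with the acoustic envelope of the stimulus to estimate tracking strength and lag. The sketch below illustrates that pipeline in Python under stated assumptions; the variable names (`ecog`, `audio`, `fs`), filter order, and lag window are hypothetical, and the paper's actual preprocessing, statistical thresholding, density-based clustering, and mixed-effects modeling are not reproduced here.

```python
# Minimal sketch of envelope-tracking via cross-correlation (assumptions noted above).
# Assumes: ecog is a (channels x samples) array and audio a (samples,) waveform,
# both resampled to a common rate fs in Hz. These names and values are hypothetical.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, correlate, correlation_lags

fs = 500  # assumed common sampling rate after resampling (Hz)

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase Butterworth band-pass filter along the last axis."""
    b, a = butter(order, [lo, hi], btype="bandpass", fs=fs)
    return filtfilt(b, a, x, axis=-1)

# Stimulus envelope: magnitude of the analytic (Hilbert) signal of the audio waveform.
stim_env = np.abs(hilbert(audio))

# Slow Frequency Band (SFB): the 1-8 Hz band-pass filtered ECoG signal.
sfb = bandpass(ecog, 1, 8, fs)

# High Frequency Band (HFB): envelope of the 70-120 Hz band-pass filtered signal.
hfb = np.abs(hilbert(bandpass(ecog, 70, 120, fs)))

def tracking_lag(neural, envelope, fs, max_lag_s=0.5):
    """Cross-correlate one channel with the stimulus envelope; return the
    peak correlation and its lag in seconds within +/- max_lag_s."""
    n = (neural - neural.mean()) / neural.std()
    e = (envelope - envelope.mean()) / envelope.std()
    r = correlate(n, e, mode="full") / len(e)
    lags = correlation_lags(len(n), len(e), mode="full") / fs
    keep = np.abs(lags) <= max_lag_s
    i = np.argmax(np.abs(r[keep]))
    return r[keep][i], lags[keep][i]

# Per-channel tracking of the stimulus envelope in each band.
sfb_results = [tracking_lag(ch, stim_env, fs) for ch in sfb]
hfb_results = [tracking_lag(ch, stim_env, fs) for ch in hfb]
```

In this sketch, positive lags mean the neural signal follows the stimulus envelope; comparing the lag maps across bands is what would expose the frontal-to-temporal HFB gradient described in the abstract.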
Copyright: © 2025 Osorio, Assaneo. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Conflict of interest statement
The authors have declared that no competing interests exist.
Similar articles
- Temporal Structure of Music Improves the Cortical Encoding of Speech. Hum Brain Mapp. 2025;46(5):e70199. doi: 10.1002/hbm.70199. PMID: 40129256. Free PMC article.
- Speech and music recruit frequency-specific distributed and overlapping cortical networks. Elife. 2024;13:RP94509. doi: 10.7554/eLife.94509. PMID: 39038076. Free PMC article.
- Human neuromagnetic steady-state responses to amplitude-modulated tones, speech, and music. Ear Hear. 2014;35(4):461-7. doi: 10.1097/AUD.0000000000000033. PMID: 24603544. Free PMC article.
- Neural sensitivity to statistical regularities as a fundamental biological process that underlies auditory learning: the role of musical practice. Hear Res. 2014;308:122-8. doi: 10.1016/j.heares.2013.08.018. PMID: 24035820. Review.
- Music perception: information flow within the human auditory cortices. Adv Exp Med Biol. 2014;829:293-303. doi: 10.1007/978-1-4939-1782-2_15. PMID: 25358716. Review.