Front Hum Neurosci. 2014 Dec 9;8:988. doi: 10.3389/fnhum.2014.00988. eCollection 2014.

Automatic processing of abstract musical tonality

Inyong Choi et al. Front Hum Neurosci.

Abstract

Music perception builds on expectancy in harmony, melody, and rhythm. Neural responses to violations of such expectations are observed in event-related potentials (ERPs) measured using electroencephalography. Most previous ERP studies demonstrating sensitivity to musical violations used stimuli that were temporally regular and musically structured, with less-frequent deviant events that differed from a specific expectation in some feature such as pitch, harmony, or rhythm. Here, we asked whether expectancies about the Western musical scale are strong enough to elicit ERP deviance components. Specifically, we explored whether pitches inconsistent with an established scale context elicit deviant components even though equally rare pitches that fit the established context do not, and even when their timing is unpredictable. We used Markov chains to create temporally irregular pseudo-random sequences of notes chosen from one of two diatonic scales. The Markov pitch-transition probabilities resulted in sequences that favored notes within the scale but lacked clear melodic, harmonic, or rhythmic structure. At random positions, the sequences contained probe tones that were either within the established scale or out of key. Our subjects ignored the note sequences, watching a self-selected silent movie with subtitles. Compared to the in-key probes, the out-of-key probes elicited a significantly larger P2 ERP component. Results show that random note sequences establish expectations of the "first-order" statistical property of musical key, even in listeners not actively monitoring the sequences.
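The stimulus construction described above can be sketched as a first-order Markov chain over pitches. The transition weights below are illustrative assumptions (the paper's actual transition matrix is not reproduced here); they merely favor in-key notes over the out-of-key G#, matching the pitch set of the experimental blocks.

```python
import random

# Pitches presented during the experimental blocks: the seven notes of the
# C major scale plus the out-of-key probe G#. The weights are an assumption
# for illustration; the paper's exact probabilities are not given here.
PITCHES = ["C", "D", "E", "F", "G", "G#", "A", "B"]

def make_transition_weights(in_key_weight=6, out_key_weight=1):
    """First-order Markov weights: from any pitch, each in-key note is
    several times more likely than the out-of-key G#."""
    return {src: [out_key_weight if dst == "G#" else in_key_weight
                  for dst in PITCHES]
            for src in PITCHES}

def generate_sequence(n_notes, seed=None):
    """Generate a pseudo-random note sequence by sampling each next pitch
    from the transition weights of the current pitch."""
    rng = random.Random(seed)
    weights = make_transition_weights()
    seq = [rng.choice([p for p in PITCHES if p != "G#"])]  # start in key
    for _ in range(n_notes - 1):
        seq.append(rng.choices(PITCHES, weights=weights[seq[-1]], k=1)[0])
    return seq
```

Because every row of the matrix down-weights G#, the sequence favors in-key notes without imposing any melodic or rhythmic structure, which is the property the study relies on.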

Keywords: ERP; P2; markov-chain; music; tonality.


Figures

Figure 1
Experimental design. (A) Session structure. Experimental and control blocks, separated by one-minute silent breaks, were presented alternately three times each; each block lasted 5.25 min. (B) The eight pitches presented during experimental blocks. Except for G#, all belonged to the C major diatonic scale. (C) The seven pitches presented during control blocks, which together formed the C# major scale.
Figure 2
Comparison of ERP time courses in different contexts for G# (out of key in the C major context but in key in the C# major context) and F (in key in both contexts). (A) Grand-average ERPs to the G# tone in the C major context (where G# is an out-of-key note) and the C# major context (where G# is an in-key note). Scalp topographies are computed at 97.7 ms, 199.2 ms, and 339.9 ms (left to right), corresponding to the N1, P2, and N2 components; vertical dashed lines on the Fz axis mark these three times. (B) Grand-average ERPs to the F tone in the C major and C# major contexts. Note that F is an in-key note in both contexts. Scalp topographies show the N1, P2, and N2 components at 97.7 ms, 199.2 ms, and 339.9 ms, respectively. (C) P-value time courses from Wilcoxon signed-rank tests comparing the instantaneous amplitudes of the ERP pairs (blue lines). H (red line) is 1 when p < 2.44 × 10−4 (= 0.05/205, where 205 is the number of time samples tested) and 0 otherwise. The period when H = 1 is also marked in panel (A) by blue shading and asterisks.
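The pointwise statistics in panel (C) can be sketched as follows. This is a minimal pure-Python version of a two-sided Wilcoxon signed-rank test using the large-sample normal approximation (the paper does not specify its implementation), combined with the Bonferroni threshold 0.05/205 from the caption.

```python
import math

def wilcoxon_signed_rank_p(x, y):
    """Two-sided Wilcoxon signed-rank test via the large-sample normal
    approximation (zero differences dropped; ties in |d| get average
    ranks; no continuity or tie correction, for brevity)."""
    d = [a - b for a, b in zip(x, y) if a != b]
    n = len(d)
    order = sorted(range(n), key=lambda i: abs(d[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:  # assign average ranks over ties in |d|
        j = i
        while j + 1 < n and abs(d[order[j + 1]]) == abs(d[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_plus = sum(r for r, di in zip(ranks, d) if di > 0)
    mean = n * (n + 1) / 4
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mean) / sd
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def significance_mask(pairs_per_sample, n_samples=205, alpha=0.05):
    """H(t) = 1 where the pointwise p-value beats the Bonferroni
    threshold alpha / n_samples (0.05/205 in the caption)."""
    thresh = alpha / n_samples
    return [1 if wilcoxon_signed_rank_p(x, y) < thresh else 0
            for x, y in pairs_per_sample]
```

Each element of `pairs_per_sample` would hold the per-subject ERP amplitudes at one time sample in the two contexts; the mask reproduces the H trace shown in panel (C).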
Figure 3
Comparison of peak P2 ERP amplitudes for G# (out of key in the C major context but in key in the C# major context) and F (in key in both contexts). (A) Individual listeners' peak ERP amplitudes measured from Cz. Left panels show baseline-to-P2 peak amplitudes; right panels show N1-to-P2 amplitudes. Each dot represents one subject, and lines connect each subject's results in the two contexts: red lines mark subjects whose amplitude was larger in the C major context, blue lines those whose amplitude was smaller. Wilcoxon signed-rank tests showed a statistically significant difference between contexts for the G#-elicited P2 and N1-to-P2 amplitudes; no significant difference was found for the F-elicited amplitudes. (B) The same tests performed at all 32 electrode locations, with the resulting p-values shown on scalp maps. Responses to G# were significantly affected by context at frontal-central sensor locations, but responses to F were not.
Figure 4
P2 amplitudes from averages of the first- and second-half epochs for G# (out of key in the C major context but in key in the C# major context). Baseline-to-P2 amplitudes at Cz from the first-half and second-half averages (left and right panels, respectively). A significant difference in P2 amplitude between incongruent and congruent contexts was found in the first-half average (left) but not in the second-half average (right).
Figure 5
Correlation between musical training duration and the P2 amplitude of the response to the out-of-key G#.
