Conceptual priming for realistic auditory scenes and for auditory words
- PMID: 24378910
- DOI: 10.1016/j.bandc.2013.11.013
Abstract
Two experiments were conducted using both behavioral and event-related brain potential (ERP) methods to examine conceptual priming effects for realistic auditory scenes and for auditory words. Prime and target sounds were presented in four stimulus combinations: Sound-Sound, Word-Sound, Sound-Word and Word-Word. Within each combination, targets were conceptually related to the prime, unrelated, or ambiguous. In Experiment 1, participants judged whether the primes and targets fit together (explicit task); in Experiment 2, they decided whether the target was typical or ambiguous (implicit task). In both experiments and in all four stimulus combinations, reaction times were longer and/or error rates were higher, and the N400 component was larger, for ambiguous targets than for conceptually related targets, pointing to a common conceptual system for processing auditory scenes and linguistic stimuli in both explicit and implicit tasks. However, fine-grained analyses also revealed differences between experiments and conditions in the scalp topography and duration of the priming effects, possibly reflecting differences in the integration of perceptual and cognitive attributes of linguistic and nonlinguistic sounds. These results have clear implications for building virtual environments that need to convey meaning without words.
Keywords: Auditory environmental scenes; Conceptual priming; Event-Related Potentials (ERP); Explicit and implicit processing; N400.
Copyright © 2013 Elsevier Inc. All rights reserved.
