Emotional speech processing at the intersection of prosody and semantics
- PMID: 23118868
- PMCID: PMC3485297
- DOI: 10.1371/journal.pone.0047279
Abstract
The ability to accurately perceive emotions is crucial for effective social interaction. Many questions remain regarding how different sources of emotional cues in speech (e.g., prosody, semantic information) are processed during emotional communication. Using a cross-modal emotional priming paradigm (the facial affect decision task), we compared the effects of processing utterances with single-channel (prosody-only) versus multi-channel (combined prosody and semantic) cues on the perception of happy, sad, and angry emotional expressions. Our data show that emotional speech cues produce robust congruency effects on decisions about an emotionally related face target, although no processing advantage occurred when prime stimuli contained multi-channel as opposed to single-channel speech cues. Our data suggest that utterances with prosodic cues alone and utterances with combined prosody and semantic cues both activate knowledge that leads to emotional congruency (priming) effects, but that the convergence of these two information sources does not always heighten access to this knowledge during emotional speech processing.