2010 Feb 26;4:19.
doi: 10.3389/fnhum.2010.00019. eCollection 2010.

"It's Not What You Say, But How You Say it": A Reciprocal Temporo-frontal Network for Affective Prosody


David I Leitman et al. Front Hum Neurosci.

Abstract

Humans communicate emotion vocally by modulating acoustic cues such as pitch, intensity and voice quality. Research has documented how the relative presence or absence of such cues alters the likelihood of perceiving an emotion, but the neural underpinnings of acoustic cue-dependent emotion perception remain obscure. Using functional magnetic resonance imaging in 20 subjects we examined a reciprocal circuit consisting of superior temporal cortex, amygdala and inferior frontal gyrus that may underlie affective prosodic comprehension. Results showed that increased saliency of emotion-specific acoustic cues was associated with increased activation in superior temporal cortex [planum temporale (PT), posterior superior temporal gyrus (pSTG), and posterior middle temporal gyrus (pMTG)] and amygdala, whereas decreased saliency of acoustic cues was associated with increased inferior frontal activity and temporo-frontal connectivity. These results suggest that sensory-integrative processing is facilitated when the acoustic signal is rich in affective information, yielding increased activation in temporal cortex and amygdala. Conversely, when the acoustic signal is ambiguous, greater evaluative processes are recruited, increasing activation in inferior frontal gyrus (IFG) and IFG-STG connectivity. Auditory regions may thus integrate acoustic information with amygdala input to form emotion-specific representations, which are evaluated within inferior frontal regions.

Keywords: amygdala; auditory cortex; emotion; inferior frontal gyrus; prosody; speech.


Figures

Figure 1
fMRI Paradigm. Subjects lay supine in the scanner and were instructed to focus on a central fixation crosshair displayed via a rear-mounted projector [PowerLite 7300 video projector (Epson America, Inc., Long Beach, CA, USA)] and viewed through a head coil-mounted mirror. After sound offset, this crosshair was replaced with a visual prompt containing emoticons representing the four emotion choices and the corresponding response button number. Auditory stimuli were presented through pneumatic headphones, and sound presentation occurred between volume acquisitions to minimize any potential impact of scanner noise on stimulus processing.
Figure 2
All stimuli > rest. Activation presented at an uncorrected p < 0.05 threshold. Grey shading represents the scanned regions of the brain.
Figure 3
Identification performance as a function of acoustic cue saliency level. (A) Mean performance across all emotion choices; error bars reflect the standard error of the mean of the raw data. The white dotted line indicates chance performance. (B) Anger: accuracy increases as HF500 increases. (C) Fear: accuracy increases as F0SD decreases. (D) Happiness: accuracy increases as F0SD increases.
Figure 4
All emotions > neutral. A subtraction of neutral activation from all emotions (anger, fear and happiness) indicates activation clusters bilaterally in posterior superior/middle temporal gyrus (pSTG/pMTG), inferior frontal gyrus (IFG) and orbitofrontal cortex (OFC). The markers in red illustrate differences between this contrast and the subsequent parametric analysis: arrow = OFC activation; * = thalamic activation; circles = absence of amygdala activation bilaterally.
Figure 5
Cue saliency-correlated activation patterns, by emotion. (A) Correlation with a standardized estimate (ZCUE) of cue saliency across all emotions revealed increased PT, pSTG and pMTG activation as cue saliency increased (red) and, conversely, increased bilateral IFG activation as cue saliency decreased (blue). (B) A similar pattern was observed for anger as HF500 increased/decreased. (C) A conjunction analysis of increasing cue saliency (increasing F0SD) for happiness and increasing cue saliency (decreasing F0SD) for fear yielded a similar pattern. (D) Uncorrected p < 0.05 maps of F0SD-modulated activity for happiness (left) and fear (right) indicate activation clusters spanning pSTG, amygdala and IFG. For happiness, increasing F0SD (red) is associated with activation increases in pSTG and amygdala, while decreasing F0SD (blue) is associated with increasing IFG activation. The reverse pattern is seen for fear: decreasing F0SD is associated with activation increases in pSTG and amygdala, while increasing F0SD is associated with increasing IFG activation.
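The caption describes ZCUE as a standardized estimate of cue saliency across emotions, with the sign of the cue-saliency mapping reversed for fear (lower F0SD is more salient) relative to happiness (higher F0SD is more salient). The paper does not give its computation, but a minimal sketch of one plausible construction, using hypothetical per-stimulus F0SD values, z-scores each cue and flips the sign where lower values signal higher saliency:

```python
import statistics

def zscore(values):
    """Standardize raw cue measurements to z-scores (mean 0, SD 1)."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mean) / sd for v in values]

# Hypothetical per-stimulus F0SD measurements (Hz); illustrative only.
f0sd_happy = [20.0, 35.0, 50.0, 65.0, 80.0]
f0sd_fear = [20.0, 35.0, 50.0, 65.0, 80.0]

# Happiness: higher F0SD = more salient, so the z-score is used as-is.
zcue_happy = zscore(f0sd_happy)
# Fear: lower F0SD = more salient, so the sign is flipped before
# pooling saliency estimates across emotions.
zcue_fear = [-z for z in zscore(f0sd_fear)]
```

Standardizing within each cue before pooling puts HF500 (anger) and F0SD (fear, happiness) on a common scale, which is what lets a single parametric regressor span all emotions.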
Figure 6
Psychophysiological interaction (PPI) analysis. This functional connectivity map illustrates the negative interaction between ZCUE and the mean time series of the IFG seed region (red sphere). The map indicates that functional connectivity between IFG and auditory processing regions is significantly modulated by cue saliency: decreasing cue saliency increases IFG-STG functional coupling, while increasing cue saliency decreases this coupling.
Figure 7
Control conjunction analyses. Increasing or decreasing F0SD across fear and happiness (i.e., in the same direction for both emotions) does not reveal activation in STG, IFG or amygdala at the uncorrected p < 0.05 threshold.
