iScience. 2023 Aug 12;26(9):107571. doi: 10.1016/j.isci.2023.107571. eCollection 2023 Sep 15.

Topographic representation of visually evoked emotional experiences in the human cerebral cortex



Changde Du et al. iScience. 2023.

Abstract

Affective neuroscience seeks to uncover the neural underpinnings of emotions that humans experience. However, it remains unclear whether an affective space underlies the discrete emotion categories in the human brain, and how it relates to the hypothesized affective dimensions. To address this question, we developed a voxel-wise encoding model to investigate the cortical organization of human emotions. Results revealed that the distributed emotion representations are constructed through a fundamental affective space. We further compared each dimension of this space to 14 hypothesized affective dimensions, and found that many affective dimensions are captured by the fundamental affective space. Our results suggest that emotional experiences are represented by broadly spatially overlapping cortical patterns and form smooth gradients across large areas of the cortex. This finding reveals the specific structure of the affective space and its relationship to hypothesized affective dimensions, while highlighting the distributed nature of emotional representations in the cortex.

Keywords: Cognitive neuroscience; Neuroscience; Sensory neuroscience.


Conflict of interest statement

The authors declare no competing interests.

Figures

Graphical abstract
Figure 1
Schematic diagrams of the experiment and analysis methods (A) Example screenshots of four video stimuli with corresponding emotion category ratings. (B) A total of 2196 video stimuli were presented to five subjects while brain activity was measured using fMRI. (C) A voxel-wise encoding model was used to predict brain activity as a linear weighted sum of the emotion category ratings, with L2-regularization. (D) The estimated affective space was compared to hypothesized affective dimensions.
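The encoding step in (C) is, in effect, ridge regression from emotion-category ratings to voxel responses. A minimal sketch with synthetic data, assuming numpy and the closed-form L2-regularized solution; the voxel count, clip count, noise level, and regularization strength here are illustrative, not the paper's actual values:

```python
import numpy as np

# Illustrative sizes (the paper used 2196 clips and 34 emotion categories;
# the voxel count and noise level here are arbitrary).
rng = np.random.default_rng(0)
n_clips, n_categories, n_voxels = 300, 34, 50

X = rng.standard_normal((n_clips, n_categories))        # emotion ratings per clip
true_W = rng.standard_normal((n_categories, n_voxels))  # unknown voxel weights
Y = X @ true_W + 0.1 * rng.standard_normal((n_clips, n_voxels))  # fMRI responses

# Ridge (L2-regularized) closed-form solution: W = (X'X + lam*I)^(-1) X'Y
lam = 1.0
W_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_categories), X.T @ Y)

# Prediction accuracy per voxel: Pearson r between predicted and measured signal
Y_pred = X @ W_hat
r = np.array([np.corrcoef(Y_pred[:, v], Y[:, v])[0, 1] for v in range(n_voxels)])
print(f"mean r = {r.mean():.3f}")
```

In practice the regularization strength is typically chosen per voxel by cross-validation on held-out clips; a single fixed `lam` stands in for that step here.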
Figure 2
Model prediction performance across the cortical surface (A) Cortical map of model prediction accuracy on both inflated and flattened cortical sheets of S03 in terms of significantly predicted voxels (p<0.01, FDR-corrected), in which well-predicted voxels appear yellow. (B) Proportion of significantly predicted voxels in representative cortical regions for S03. (abbreviations: V, visual; LO, lateral occipital; TPJ, temporo-parietal junction; IPL, inferior parietal lobule; PC, precuneus; STS, superior temporal sulcus; TE, temporal area; MTC, medial temporal cortex; DLPFC/DMPFC/VMPFC, dorsolateral/dorsomedial/ventromedial prefrontal cortex; ACC, anterior cingulate cortex; and OFC, orbitofrontal cortex). (C) Histogram of prediction accuracy for all cortical voxels for S03. The red line indicates the threshold for significant prediction (p<0.01, FDR-corrected).
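The significance threshold in (A) and (C) relies on FDR correction across many voxel-wise tests. A hedged sketch of the standard Benjamini-Hochberg step-up procedure (the paper does not specify its exact implementation, so this is the textbook version):

```python
import numpy as np

def fdr_bh(pvals, q=0.01):
    """Benjamini-Hochberg step-up: boolean mask of tests significant at FDR q."""
    p = np.asarray(pvals)
    n = p.size
    order = np.argsort(p)
    ranked = p[order]
    # Find the largest k such that p_(k) <= (k/n) * q
    below = ranked <= (np.arange(1, n + 1) / n) * q
    mask = np.zeros(n, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()
        mask[order[:k + 1]] = True  # all tests up to and including rank k pass
    return mask

# Toy p-values: a few strong effects among nulls
pvals = [1e-6, 2e-5, 0.004, 0.03, 0.2, 0.5, 0.8]
print(fdr_bh(pvals, q=0.01))  # first three survive at q = 0.01
```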
Figure 3
Disentangling the respective contributions of visual, semantic, and emotion features in voxel-wise encoding (A) Comparison of model prediction accuracies with a 2D histogram. All voxels are represented in this histogram, where the diagonal corresponds to identical prediction accuracy for both models; a distribution deviating from the diagonal means that one model predicts better than the other. (B) Differences in prediction accuracy between emotion and visual features projected on the cortical map, where voxels better predicted by visual features are shown in blue and voxels better predicted by emotion features are shown in red. (C) The same comparison for emotion against semantic features, where voxels better predicted by semantic features are shown in blue and voxels better predicted by emotion features are shown in red. Results for other subjects are shown in Figure S4.
Figure 4
Fundamental affective space and cortical mapping (A) Singular values of the covariance matrix from PCA, and a Q-Q plot of the observed singular values versus the quantiles obtained from the inverse CDF of the Marčenko–Pastur distribution. (B) Cross-subject consistency of the fundamental affective space. The blue bars indicate individual-to-group correlations, calculated between each subject's individual affective space and the sub-group affective space. The yellow bars indicate individual-to-emotion correlations, calculated between each subject's individual affective space and the behavioral semantic space (see STAR Methods). Error bars show the minimum and maximum correlation coefficients over the 1000 bootstrap samples (p<0.001). (C) Affective space constructed from brain activity. The 34 emotion categories are organized by their coefficients on the first and second PCs. The color of each marker is determined by an RGB colormap based on the category's coefficients on the top three PCs, and its position by the coefficients on PC1 and PC2, so that categories represented similarly in the brain appear near each other. (D) Affective space constructed from behavioral data. (E) Cortical maps of the fundamental affective space on both inflated and flattened cortical sheets of S03. RGB color is determined by the voxel coefficients on the top three PCs. (F) Projection of voxel coefficients onto each individual PC for S03.
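The Marčenko–Pastur comparison in (A) is a common way to decide how many PCs rise above what pure noise would produce: eigenvalues of a noise-only covariance matrix fall below the distribution's upper edge, so components exceeding that edge are retained. A sketch under the assumption of unit-variance noise, with a synthetic rank-3 signal (all sizes and strengths invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_vars, n_signal = 1000, 100, 3

# Unit-variance noise plus three strong rank-one signal components
X = rng.standard_normal((n_samples, n_vars))
for _ in range(n_signal):
    X += 0.5 * np.outer(rng.standard_normal(n_samples),
                        rng.standard_normal(n_vars))

Xc = X - X.mean(axis=0)
eigvals = np.linalg.eigvalsh(Xc.T @ Xc / n_samples)[::-1]  # covariance spectrum

# Marchenko-Pastur upper edge for unit-variance noise with aspect ratio p/n
gamma = n_vars / n_samples
mp_edge = (1 + np.sqrt(gamma)) ** 2
n_kept = int((eigvals > mp_edge).sum())
print(f"components above the noise edge: {n_kept}")
```

The rank-3 signal produces a few eigenvalues far above the edge, while the noise bulk stays below it, so the count recovers roughly the planted dimensionality.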
Figure 5
Interpretation of the recovered fundamental affective space (A) Pearson's correlation coefficients between the top four PCs and 14 hypothesized affective dimensions are compared. (B) The emotion category projections corresponding to the top four PCs. For each PC, the most related affective dimension is shown at the top: arousal for PC1 (r=0.30), approach for PC2 (negative correlation, r=0.56), attention for PC3 (negative correlation, r=0.62), and commitment for PC4 (r=0.34). Emotion categories that match the PC's best-explained affective dimension in polarity are marked in red. (C) Examples of video screenshots for arousal, approach, attention, and commitment according to their polarity, measured on a 9-point Likert scale.
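The dimension-matching in (A) amounts to computing Pearson correlations between each PC's category projections and each hypothesized dimension's behavioral ratings, then ranking by absolute value (since the paper reports negative matches too). A toy sketch; the coupling strengths and the subset of dimension names are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n_categories = 34  # emotion categories, as in the paper

# Hypothetical data: one PC's category projections, plus behavioral ratings
# for three candidate affective dimensions (couplings are made up).
pc = rng.standard_normal(n_categories)
dims = {
    "arousal":   0.8 * pc + 0.6 * rng.standard_normal(n_categories),
    "approach":  rng.standard_normal(n_categories),
    "attention": rng.standard_normal(n_categories),
}

# Pearson r between the PC and each candidate; rank by absolute correlation
corrs = {name: float(np.corrcoef(pc, x)[0, 1]) for name, x in dims.items()}
best = max(corrs, key=lambda k: abs(corrs[k]))
print(best, round(corrs[best], 2))
```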
Figure 6
Smoothness of the cortical maps (A) Qualitative comparison of a randomly constructed space (gray) and the estimated affective space (blue) for a typical subject (S03). (B) Quantitative comparison of smoothness. Gray error bars show 95% confidence intervals for the random-space results. For adjacent voxels (distance 1) and voxels separated by one intermediate voxel (distance 2), the smoothness of the estimated affective space projections is significantly greater than chance (p<0.01) in all subjects but S05.
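The chance comparison in (B) can be framed as a permutation test: shuffle the voxel projections spatially and ask how often a shuffled map is as smooth as the observed one. A sketch on a hypothetical 1D "cortical sheet" (the paper works on the 2D flattened cortex; this is a deliberate simplification):

```python
import numpy as np

rng = np.random.default_rng(3)
n_voxels = 500

# Hypothetical 1D "cortical sheet": a smooth projection (cumulative noise,
# rescaled) stands in for one PC's voxel coefficients.
smooth_map = np.cumsum(rng.standard_normal(n_voxels)) / np.sqrt(n_voxels)

def smoothness(vals, distance=1):
    # Negative mean absolute difference between voxels `distance` apart;
    # values closer to zero mean a smoother map.
    return -np.abs(vals[distance:] - vals[:-distance]).mean()

# Null distribution: smoothness of spatially shuffled copies of the map
null = np.array([smoothness(rng.permutation(smooth_map)) for _ in range(1000)])
observed = smoothness(smooth_map)
p_value = float((null >= observed).mean())
print(f"permutation p = {p_value:.3f}")
```

Shuffling destroys spatial structure while keeping the value distribution fixed, so a genuinely smooth map yields an observed statistic far in the tail of the null.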
Figure 7
Affective gradients on the cortical maps. Gradients corresponding to PC1-PC4 are indicated by white arrows and numbers.


Cited by

  • Cerebral topographies of perceived and felt emotions.
    Saarimäki H, Nummenmaa L, Volynets S, Santavirta S, Aksiuto A, Sams M, Jääskeläinen IP, Lahnakoski JM. Imaging Neurosci (Camb). 2025 Mar 27;3:imag_a_00517. doi: 10.1162/imag_a_00517. eCollection 2025. PMID: 40800968. Free PMC article.

