Decoding unattended fearful faces with whole-brain correlations: an approach to identify condition-dependent large-scale functional connectivity
- PMID: 22479172
- PMCID: PMC3315448
- DOI: 10.1371/journal.pcbi.1002441
Abstract
Processing of unattended threat-related stimuli, such as fearful faces, has previously been examined using group functional magnetic resonance imaging (fMRI) approaches. However, the identification of features of brain activity containing sufficient information to decode, or "brain-read", unattended (implicit) fear perception remains an active research goal. Here we test the hypothesis that patterns of large-scale functional connectivity (FC) decode the emotional expression of implicitly perceived faces within single individuals using training data from separate subjects. fMRI and a blocked design were used to acquire BOLD signals during implicit (task-unrelated) presentation of fearful and neutral faces. A pattern classifier (linear-kernel Support Vector Machine, or SVM) with linear filter feature selection used pair-wise FC as features to predict the emotional expression of implicitly presented faces. We plotted classification accuracy against the number of top-N selected features and observed that significantly above-chance accuracies (90-100%) were achieved with 15-40 features. During fearful face presentation, the most informative and positively modulated FC was between the angular gyrus and hippocampus, while the greatest overall contributing region was the thalamus, with positively modulated connections to bilateral middle temporal gyrus and insula. Other FCs that predicted fear included superior occipital and parietal regions, cerebellum and prefrontal cortex. By comparison, patterns of spatial activity (as opposed to interactivity) were relatively uninformative in decoding implicit fear. These findings indicate that whole-brain patterns of interactivity are a sensitive and informative signature of unattended fearful emotion processing. At the same time, we propose and demonstrate a sensitive, exploratory approach for the identification of large-scale, condition-dependent FC. In contrast to model-based group approaches, the current approach does not discount the multivariate, joint responses of multiple functional connections and is not hampered by signal loss or the need for multiple-comparisons correction.
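The abstract outlines a pipeline of pair-wise FC features, a linear filter for top-N feature selection, and a linear-kernel SVM evaluated across subjects. Below is a minimal sketch of that kind of pipeline using scikit-learn and synthetic placeholder data; the ROI count, block structure, F-test filter, and leave-one-subject-out cross-validation are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

# Synthetic placeholder data standing in for ROI time series (all sizes assumed):
# 10 subjects x 6 blocks, 20 time points per block, 90 ROIs.
rng = np.random.default_rng(0)
n_blocks, n_timepoints, n_rois = 60, 20, 90
roi_ts = rng.standard_normal((n_blocks, n_timepoints, n_rois))
y = rng.integers(0, 2, n_blocks)            # 0 = neutral block, 1 = fearful block
subjects = np.repeat(np.arange(10), 6)      # subject ID per block

def pairwise_fc(ts):
    """Vectorise the upper triangle of each block's ROI-ROI correlation matrix."""
    n_blocks, _, n_rois = ts.shape
    iu = np.triu_indices(n_rois, k=1)
    feats = np.empty((n_blocks, iu[0].size))
    for b in range(n_blocks):
        corr = np.corrcoef(ts[b].T)         # (n_rois, n_rois) Pearson correlations
        feats[b] = corr[iu]
    return feats

X = pairwise_fc(roi_ts)                     # one row of pair-wise FC features per block

# Sweep the number of top-ranked connections kept by the univariate filter,
# classifying blocks from held-out subjects with a linear-kernel SVM.
for k in (15, 20, 30, 40):
    clf = make_pipeline(SelectKBest(f_classif, k=k), SVC(kernel="linear"))
    acc = cross_val_score(clf, X, y, groups=subjects, cv=LeaveOneGroupOut())
    print(f"top {k} FC features: mean accuracy = {acc.mean():.2f}")
```

Fitting the feature filter inside the cross-validation pipeline keeps feature selection restricted to training subjects, so the accuracy-vs.-N curve is not inflated by selection bias.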
Conflict of interest statement
The authors have declared that no competing interests exist.