Hum Brain Mapp. 2018 Oct;39(10):3993-4006. doi: 10.1002/hbm.24226. Epub 2018 Jun 8.

Spatiotemporal dynamics in human visual cortex rapidly encode the emotional content of faces

Diana C Dima et al. Hum Brain Mapp. 2018 Oct.

Abstract

Recognizing emotion in faces is important in human interaction and survival, yet existing studies do not paint a consistent picture of the neural representation supporting this task. To address this, we collected magnetoencephalography (MEG) data while participants passively viewed happy, angry and neutral faces. Using time-resolved decoding of sensor-level data, we show that responses to angry faces can be discriminated from happy and neutral faces as early as 90 ms after stimulus onset and only 10 ms later than faces can be discriminated from scrambled stimuli, even in the absence of differences in evoked responses. Time-resolved relevance patterns in source space track expression-related information from the visual cortex (100 ms) to higher-level temporal and frontal areas (200-500 ms). Together, our results point to a system optimised for rapid processing of emotional faces and preferentially tuned to threat, consistent with the important evolutionary role that such a system must have played in the development of human social interactions.
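The time-resolved decoding approach described above can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' actual pipeline: it trains a simple nearest-class-mean classifier (a stand-in for whatever classifier was used in the study) with k-fold cross-validation, independently at each timepoint of a hypothetical (trials × sensors × timepoints) MEG array, yielding an accuracy trace over time.

```python
import numpy as np

def time_resolved_decoding(X, y, n_folds=5, seed=None):
    """Cross-validated decoding accuracy at each timepoint.

    X: array of shape (n_trials, n_sensors, n_times), sensor-level data.
    y: array of shape (n_trials,), binary condition labels (0 or 1).
    Returns an accuracy trace of shape (n_times,).
    """
    rng = np.random.default_rng(seed)
    n_trials, _, n_times = X.shape
    folds = np.array_split(rng.permutation(n_trials), n_folds)
    acc = np.zeros(n_times)
    for t in range(n_times):
        correct = 0
        for k in range(n_folds):
            test = folds[k]
            train = np.concatenate([folds[j] for j in range(n_folds) if j != k])
            # Nearest-class-mean classifier on the sensor pattern at time t:
            # assign each test trial to the class whose training mean is closer.
            m0 = X[train][y[train] == 0, :, t].mean(axis=0)
            m1 = X[train][y[train] == 1, :, t].mean(axis=0)
            d0 = np.linalg.norm(X[test][:, :, t] - m0, axis=1)
            d1 = np.linalg.norm(X[test][:, :, t] - m1, axis=1)
            pred = (d1 < d0).astype(int)
            correct += int((pred == y[test]).sum())
        acc[t] = correct / n_trials
    return acc
```

On synthetic data where the two conditions differ only after some timepoint, the trace sits near chance (0.5) early and rises once the condition-specific signal appears, which is the logic behind reading a "decoding onset" (here, ∼90-100 ms) off such curves.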

Keywords: face perception; magnetoencephalography (MEG); multivariate pattern analysis (MVPA); threat bias.


Figures

Figure 1. Experimental paradigm, together with examples of one scrambled image and two face stimuli from the NimStim database, after normalization of Fourier amplitudes.
Figure 2. (a) Sensors exhibiting significant differences in responses to faces compared to scrambled stimuli (marked with asterisks) at the M170 latency (left) and M220 latency (right; p < .01). (b) Timecourses of the evoked responses to neutral faces and scrambled stimuli from right occipital and temporal sensors, averaged across subjects (±SEM).
Figure 3. Searchlight MVPA analysis of differences in face/scrambled stimulus processing. The left-hand panel summarizes time-resolved decoding accuracy (averaged across subjects and 50 ms time windows). The right-hand panel depicts the proportion of participants achieving above-chance decoding at each sensor regardless of latency (sensors significant in all subjects and selected for further analysis are marked with asterisks).
Figure 4. (a) Decoding accuracy for the face vs. scrambled problem in source space, with 95% CI and significant decoding time window (black horizontal line, starting at ∼100 ms). (b) Patterns derived from broadband source-space decoding of faces and scrambled stimuli for 8 key ROIs, for the 0-500 ms time window after stimulus onset. (c) Whole-brain patterns averaged across the first 250 ms after stimulus onset and plotted on the semi-inflated MNI template brain. Bilateral ROI labels: CA: calcarine cortex; CU: cuneus; LI: lingual gyrus; OS: occipital superior; OM: occipital medial; OI: occipital inferior; PC: precuneus; FG: fusiform gyrus.
Figure 5. (a) Accuracy traces averaged across participants for each emotion classification problem and each of the four sensor sets (shown in the left-hand plot). The vertical lines mark stimulus onset and the shaded areas depict 95% bootstrapped CIs. The horizontal lines represent clusters of at least five significant timepoints (FDR-corrected p < .05). Significant decoding onset is marked with vertical lines (at ∼100 ms for the angry vs. neutral/happy face decoding using occipital sensors). Accuracy traces were smoothed with a 10-point moving average for visualization only. (b) As above, for the sensor set based on the searchlight feature selection method (shown in the left-hand plot).
Figure 6. (a) Accuracy traces averaged across participants for each emotion classification problem in source space, using the 84 AAL atlas-based ROIs (shown in the left-hand plot). (b) Broadband relevance patterns derived from classifier weights in source space for all three decoding problems, averaged across subjects and 100 ms time windows, baselined and normalized, and mapped on the semi-inflated MNI template brain for time windows between 100 and 500 ms post-stimulus onset. Patterns visualized here are descriptive and represent each ROI in terms of its relative role in classification across subjects, without statistical testing.
Figure 7. Results of permutation testing of the relevance patterns shown in Figure 6, for each decoding problem and time window between 100 and 500 ms. Highlighted ROIs were assigned significant weights (p < .05, corrected).

