Comparative Study

Cortical responses to consciousness of schematic emotional facial expressions: a high-resolution EEG study

Claudio Babiloni et al. Hum Brain Mapp. 2010 Oct;31(10):1556-69. doi: 10.1002/hbm.20958.

Abstract

Is conscious perception of emotional facial expression related to enhanced cortical responses? Electroencephalographic data (112 channels) were recorded in 15 normal adults during the presentation of cue stimuli with neutral, happy, or sad schematic faces (duration: "threshold time," inducing about 50% correct recognitions), masking stimuli (2 s), and go stimuli with happy or sad schematic faces (0.5 s). The subjects clicked the left (right) mouse button in response to go stimuli with happy (sad) faces. After the response, they said "seen" or "not seen" with reference to the previous cue stimulus. The electroencephalographic data were averaged to form visual event-related potentials (ERPs), whose cortical sources were estimated with the LORETA software. Reaction time to the go stimuli was generally shorter during "seen" than "not seen" trials, possibly due to covert attention and awareness. The cue stimuli evoked four ERP components (posterior N100, N170, P200, and P300), which had similar peak latencies in the "not seen" and "seen" ERPs. Only the N170 showed amplitude differences between the "seen" and "not seen" ERPs. Compared to the "not seen" ERPs, the "seen" ones showed prefrontal, premotor, and posterior parietal sources of the N170 that were higher in amplitude with the sad cue stimuli and lower in amplitude with the neutral and happy cue stimuli. These results suggest that nonconscious and conscious processing of schematic emotional facial expressions share a similar temporal evolution of cortical activity, and that conscious processing induces an early enhancement of bilateral cortical activity for the schematic sad facial expressions (N170).
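The ERP components reported here (N100, N170, P200, P300) are obtained by averaging many stimulus-locked EEG epochs and then reading off peak latency and amplitude within a fixed post-stimulus window. The following sketch illustrates the principle only; it is not the authors' pipeline. The data are synthetic, and the sampling rate, epoch length, and 130-210 ms search window are illustrative assumptions.

```python
# Minimal sketch of ERP averaging and N170 peak detection.
# Synthetic data; sampling rate and windows are assumptions, not the study's parameters.
import math
import random

FS = 500          # sampling rate in Hz (assumed)
N_SAMPLES = 250   # 500 ms post-stimulus epoch
N_TRIALS = 100

def simulate_epoch():
    """One noisy epoch with a negative deflection peaking near 170 ms."""
    return [
        -5.0 * math.exp(-((t * 1000 / FS - 170) ** 2) / (2 * 15 ** 2))
        + random.gauss(0, 2.0)
        for t in range(N_SAMPLES)
    ]

def grand_average(epochs):
    """Average epochs sample by sample; trial-averaging cancels noise, leaving the ERP."""
    return [sum(col) / len(col) for col in zip(*epochs)]

def peak_in_window(erp, lo_ms, hi_ms):
    """Most negative sample within [lo_ms, hi_ms]; returns (latency in ms, amplitude)."""
    lo, hi = int(lo_ms * FS / 1000), int(hi_ms * FS / 1000)
    idx = min(range(lo, hi), key=lambda t: erp[t])
    return idx * 1000 / FS, erp[idx]

random.seed(0)
erp = grand_average([simulate_epoch() for _ in range(N_TRIALS)])
latency, amplitude = peak_in_window(erp, 130, 210)
print(f"N170-like peak: {latency:.0f} ms, {amplitude:.2f} uV")
```

Averaging 100 trials reduces the noise standard deviation by a factor of 10, which is why the negative deflection becomes clearly measurable in the grand average even when single epochs are dominated by noise.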

Figures

Figure 1
Stimuli presented in a typical trial: the sequence of cue stimuli and subsequent target stimuli in the second phase of the experimental procedure. The sequence of the visual stimuli was as follows: (i) background stimulus lasting 5.5 s; (ii) cue stimulus, an "emotional emoticon" (33% for each kind of emotion), appearing at the center of the monitor for the threshold time; (iii) background stimulus lasting about 2 s; (iv) go (target) stimulus lasting 0.5 s. The go stimulus was a green "emoticon" with a happy or sad facial expression (50% probability each), which appeared at the center of the monitor. The subjects had to press the left mouse button if the target stimulus had a happy facial expression, whereas they had to press the right mouse button if it had a sad facial expression. [Color figure can be viewed in the online issue, which is available at wileyonlinelibrary.com.]
Figure 2
Mean and standard error (N = 15) of reaction time for the statistically significant [F(2,28) = 4.62; P < 0.0185] interaction among the factors Condition (“not seen” and “seen”), Cue stimulus emotion (neutral, happy, and sad), and Go stimulus emotion (happy and sad). Planned post‐hoc comparisons showed that (i) the reaction time to the happy go stimuli was longer in the “not seen” than “seen” trials with the neutral (P < 0.00002), happy (P < 0.00002), and sad (P < 0.015) cue stimuli; and (ii) the reaction time to the sad target stimuli was faster in the “seen” than “not seen” trials with the happy cue stimuli (P < 0.02). Compared to the nonconscious perception of the emotional schematic face expressions, the conscious one was associated with faster reaction time to the target stimuli as a possible effect of covert attention and awareness. [Color figure can be viewed in the online issue, which is available at wileyonlinelibrary.com.]
Figure 3
Grand average (N = 15) waveforms of the "not seen" and "seen" ERPs for the neutral, happy, and sad cue stimuli (i.e., schematic facial expressions). These ERPs refer to representative midline electrodes (Fz, Cz, Pz, and Oz). The ERP waveforms disclosed a sequence of negative-positive peaks, namely N100, N170, P200, and P300 (N = negative, P = positive; numbers indicate approximate post-stimulus peak latency in milliseconds). These components were quite similar in shape and amplitude in both "not seen" and "seen" ERPs. As an exception, the N170 amplitude was higher in the "seen" than in the "not seen" trials, especially at posterior electrodes. This component was, hence, selected for further analyses.
Figure 4
Grand average (N = 15) of the LORETA N170 source solutions averaged across all subjects, conditions ("not seen" and "seen"), and emotional expressions of the cue stimuli (neutral, happy, and sad). The most activated BAs for the frontal, central, parietal, occipital, and temporal cortical lobes of interest were BA 9, BA 6M, BA 7, BA 17, and BA 37, respectively. These BAs were used for the ANOVAs comparing LORETA source amplitude. [Color figure can be viewed in the online issue, which is available at wileyonlinelibrary.com.]
Figure 5
Grand average (N = 15) of LORETA solutions separately modeling the N170 sources relative to neutral, happy, and sad cue stimuli for the “not seen” and “seen” trials. The amplitude of the central (BA 6) and parietal (BA 7) N170 sources appeared to be markedly higher in the “seen” with respect to the “not seen” trials. [Color figure can be viewed in the online issue, which is available at wileyonlinelibrary.com.]
Figure 6
Individual LORETA solutions for the cortical lobes of interest were used as input for the ANOVAs. For the frontal lobe (BA 9), the ANOVA of the LORETA N170 source solutions showed a statistically significant interaction [F(2,28) = 3.25; P < 0.0536] between the factors Condition and Cue stimulus emotion. The top graph illustrates the mean LORETA source amplitude relative to this interaction. The amplitude of the LORETA solutions for the sad cue stimuli was higher in the "seen" than in the "not seen" ERPs, whereas the opposite was true for the happy cue stimuli. For the central region (BA 6M), the ANOVA pointed to a statistically significant interaction between the factors Condition, Hemisphere, and Cue stimulus emotion [F(2,28) = 4.37; P < 0.0223]. The middle graph illustrates the mean LORETA source amplitude relative to this interaction. For the left hemisphere, the amplitude of the LORETA N170 source solutions was higher in the "seen" than in the "not seen" trials for the sad cue stimuli (P < 0.00003); the opposite was true for the happy (P < 0.03) and neutral (P < 0.00003) cue stimuli. For the right hemisphere, the amplitude of the LORETA N170 source solutions was higher in the "seen" than in the "not seen" trials for the sad cue stimuli (P < 0.00003); the opposite was true for the neutral cue stimuli (P < 0.00003). For the parietal lobe (BA 7), there was a statistically significant interaction [F(2,28) = 4.16; P < 0.0262] between the factors Condition and Cue stimulus emotion. The bottom graph illustrates the mean LORETA source amplitude relative to this interaction. The amplitude of the LORETA solutions for the sad cue stimuli was higher in the "seen" than in the "not seen" ERPs, whereas the opposite was true for the happy cue stimuli. [Color figure can be viewed in the online issue, which is available at wileyonlinelibrary.com.]
Figure 7
LORETA source solutions relative to the N170 peak for the schematic (emoticon) and real (face) sad facial expressions. Both kinds of visual stimuli induced an activation of ventral occipital and temporal cortical areas in the typical regions involved in the processing of visual stimuli illustrating human faces and in the generation of N170. [Color figure can be viewed in the online issue, which is available at wileyonlinelibrary.com.]
