Dev Psychobiol. 2023 Mar;65(2):e22361. doi: 10.1002/dev.22361.

Signatures of emotional face processing measured by event-related potentials in 7-month-old infants

Özlü Aran et al. Dev Psychobiol. 2023 Mar.

Abstract

The ability to distinguish facial emotions emerges in infancy. Although this ability has been shown to emerge between 5 and 7 months of age, the literature is less clear regarding the extent to which neural correlates of perception and attention play a role in processing of specific emotions. This study's main goal was to examine this question among infants. To this end, we presented angry, fearful, and happy faces to 7-month-old infants (N = 107, 51% female) while recording event-related brain potentials. The perceptual N290 component showed a heightened response for fearful and happy relative to angry faces. Attentional processing, indexed by the P400, showed some evidence of a heightened response for fearful relative to happy and angry faces. We did not observe robust differences by emotion in the negative central (Nc) component, although trends were consistent with previous work suggesting a heightened response to negatively valenced expressions. Results suggest that perceptual (N290) and attentional (P400) processing is sensitive to emotions in faces, but these processes do not provide evidence for a fear-specific bias across components.

Keywords: N290; Nc; P400; emotions; face processing; infant ERP.


Conflict of interest statement

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Figures

Figure 1. Task Progression.
Figure 2. Location of Electrodes Corresponding to Each Region.
Note. Pink circles represent the channels used for the broad analysis of the N290 and P400. The pink shaded channels (38, 39-O2, 43, 44-P8) were used for the focused analysis of the N290 and P400. Green circles identify channels used for the broad analysis of the Nc. The subset of green shaded channels (4, 7, 16, 21, 41, 51, 54) represents those used for the focused analysis of the Nc.
Figure 3. Visualization of ERP Results.
Note. A) Grand average scalp topographical response for time windows characterizing the N290 (left) and P400/Nc (right) across electrodes of interest and emotion categories. White dots represent the approximate scalp locations of posterior electrodes of interest used for the focused ERP analysis of the N290 and P400. Black dots represent the approximate scalp locations of fronto-central electrodes of interest used for the focused ERP analysis of the Nc. B) Event-related potential waveforms for the focused analysis of emotion in faces over posterior and fronto-central scalp sites. Shaded grey regions represent approximate time windows of interest used for analyses. C) Average ERP response to fearful (F), angry (A), and happy (H) faces for each component of interest in focused regions. An * indicates a significant difference between emotion conditions in bar graphs. Note that values for the N290 represent corrected peak amplitudes, whereas both mean and corrected peak amplitudes are shown for the P400. Values for the Nc are mean amplitudes (see text for further details). Error bars represent ±1 standard error.
