J Vis. 2019 Mar 1;19(3):8. doi: 10.1167/19.3.8.

Recall of facial expressions and simple orientations reveals competition for resources at multiple levels of the visual hierarchy

Viljami R Salmela et al.

Abstract

Many studies of visual working memory have tested humans' ability to reproduce primary visual features of simple objects, such as the orientation of a grating or the hue of a color patch, following a delay. A consistent finding of such studies is that the precision of responses declines as the number of items in memory increases. Here we compared visual working memory for primary features and high-level objects. We presented participants with memory arrays consisting of oriented gratings, facial expressions, or a mixture of both. Precision of reproduction for all facial expressions declined steadily as the memory load was increased from one to five faces. For primary features, this decline, and the specific distributions of error observed, have been parsimoniously explained in terms of neural population codes. We adapted the population coding model for circular variables to the non-circular and bounded parameter space used for expression estimation. Total population activity was held constant according to the principle of normalization, and the intensity of expression was decoded by drawing samples from the Bayesian posterior distribution. The model fit the data well, showing that principles of population coding can be applied to model memory representations at multiple levels of the visual hierarchy. When both gratings and faces had to be remembered, an asymmetry was observed: increasing the number of faces decreased the precision of orientation recall, but increasing the number of gratings did not affect recall of expression, suggesting that memorizing faces involves the automatic encoding of low-level features in addition to higher-level expression information.
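The model described above can be illustrated with a minimal simulation. The sketch below is not the authors' fitted model; the number of neurons, gain, and tuning width are illustrative assumptions. It captures the two principles named in the abstract: total population activity is normalized (gain is shared equally across the items in memory), and the remembered value is decoded by drawing a sample from the Bayesian posterior over the bounded feature space.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not the paper's fitted values)
N_NEURONS = 60        # neurons tiling the bounded feature space [0, 1]
TOTAL_GAIN = 50.0     # total expected spike count, shared across items
TUNING_WIDTH = 0.15   # Gaussian tuning width in feature units

prefs = np.linspace(0.0, 1.0, N_NEURONS)  # preferred stimulus values

def tuning(stimulus, gain):
    """Mean firing of each neuron for a stimulus (Gaussian tuning),
    scaled so the expected total population activity equals `gain`."""
    f = np.exp(-0.5 * ((prefs - stimulus) / TUNING_WIDTH) ** 2)
    return gain * f / f.sum()

def recall(stimulus, set_size):
    """Encode one item with gain divided by set size (normalization),
    then decode by sampling from the posterior over a stimulus grid."""
    gain = TOTAL_GAIN / set_size
    spikes = rng.poisson(tuning(stimulus, gain))
    grid = np.linspace(0.0, 1.0, 201)
    # Poisson log-likelihood of the observed spikes for each candidate value
    log_like = np.array([
        np.sum(spikes * np.log(tuning(s, gain) + 1e-12) - tuning(s, gain))
        for s in grid
    ])
    post = np.exp(log_like - log_like.max())
    post /= post.sum()
    return rng.choice(grid, p=post)  # posterior-sampling decoder

# Recall variability should grow with set size, as in the paper's data
for n in (1, 3, 5):
    errors = [recall(0.5, n) - 0.5 for _ in range(200)]
    print(n, round(float(np.std(errors)), 3))
```

Because the shared gain shrinks as 1/set_size, each item is encoded with fewer expected spikes at higher loads, so the posterior widens and sampled responses become more variable, reproducing the set-size effect qualitatively.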


Figures

Figure 1. Experiment testing memory for faces expressing different emotions (Experiment 1A). A display containing between one and five faces was presented (duration 1 s/face), followed, after a 2 s retention interval, by a spatial cue and probe face. The participant adjusted the emotional intensity of the probe face to match the face that had appeared at the cued location (the target face). The emotion and identity of the probe always matched the target. The face images in Figure 1 are from the Radboud Faces Database (http://www.socsci.ru.nl:8180/RaFD2/RaFD); all of its images can be used in strictly scientific publications.
Figure 2. Recall performance for facial expressions. (A) Standard deviation of errors as a function of set size for five different emotional expressions (colored circles). (B) Bias of errors as a function of set size for different emotional expressions. The effect of expression intensity on standard deviation (C) and bias (D) of recall errors.
Figure 3. Recall performance for oriented gratings. (A) Mean distributions of response error for orientation recall, for set sizes 1, 3, and 5. The ML decoding (purple curves) and posterior sampling (yellow curves) versions of the neural resource models produced almost identical fits to the data. (B) Best fitting model parameters for each participant.
Figure 4. Memory error distributions (symbols) for faces and fits of the population coding model (lines). Error distributions are shown averaged over expressions (top row), and separately for happy (bottom row, purple symbols and lines) and sad (bottom row, green symbols and lines) expressions, as a function of memory load from one to five faces (plots from left to right). In all plots, data are averaged over participants and target intensities. Data and code for plotting individual error distributions and fits are available at https://osf.io/v79h6/.
Figure 5. Population coding model parameters for different facial expressions. (A) Gain constants differed between expressions. (B) Tuning widths across expressions were roughly constant.
Figure 6. Bias of the error distributions for different set sizes (one to five faces, plots from left to right). Data averaged across expressions.
Figure 7. Recall variability for facial expressions and grating orientations. (A) Effect of set size on recall of expressions (red; Euclidean SD, on left y-axis) and orientations (blue; circular SD, on right y-axis), tested separately. (B) Orientation recall for memory arrays consisting of one or three gratings plus zero or one face. Storing a facial expression increases orientation recall variability. (C) Recall of facial expressions for memory arrays consisting of one face plus zero, one, or three gratings. Storing orientations does not affect facial expression recall variability. Note that SD values are on different scales for faces and orientations and are not directly comparable.

