A multisensory perspective on object memory

Review
Pawel J Matusz et al. Neuropsychologia. 2017 Oct;105:243-252.
doi: 10.1016/j.neuropsychologia.2017.04.008. Epub 2017 Apr 8.

Abstract

Traditional studies of memory and object recognition involved objects presented within a single sensory modality (i.e., purely visual or purely auditory objects). However, in naturalistic settings, objects are often evaluated and processed in a multisensory manner. This begets the question of how object representations that combine information from the different senses are created and utilised by memory functions. Here we review research that has demonstrated that a single multisensory exposure can influence memory for both visual and auditory objects. In an old/new object discrimination task, objects that were presented initially with a task-irrelevant stimulus in another sense were better remembered than stimuli presented alone, most notably when the two stimuli were semantically congruent. The brain discriminates between these two types of object representations within the first 100 ms post-stimulus onset, indicating early "tagging" of objects/events by the brain based on the nature of their initial presentation context. Interestingly, the specific brain networks supporting the improved object recognition vary with several factors, including the effectiveness of the initial multisensory presentation and the sense that is task-relevant. We specify the requisite conditions for multisensory contexts to improve object discrimination following single exposures, and the individual differences that exist with respect to these improvements. Our results shed light on how memory operates on the multisensory nature of object representations as well as how the brain stores and retrieves memories of objects.

Keywords: Auditory; Cross-modal; Learning; Memory; Multisensory; Object; Visual.

Figures

Figure 1
A cartoon of a cocktail party setting. This is a typical scenario in which multisensory information that is synchronous, co-localised and semantically congruent co-occurs with information that is none of these. It also exemplifies a scenario in which information must be learned for later recognition in a different context.
Figure 2
a. Schematic of the multisensory continuous recognition task. When vision is the task-relevant sensory modality, the participant indicates whether the image is being presented for the first time or is a repeat. Initial presentations are divided between those that are unisensory visual and those that are multisensory. The multisensory context varies according to the semantic content of the sound (here congruent, meaningless, or incongruent). Repeated presentations are exclusively visual and therefore differ only in how they had been initially experienced (denoted by V−, V+c, V+m, and V+i). Within a block of trials, all of these stimulus conditions are intermixed; a sketch of such a trial sequence follows this caption. b. Summary of behavioural findings. Accuracy for the various repeated presentations is displayed. Blue lines refer to studies where the task was performed in the visual modality, and green lines to studies where the task was performed in the auditory modality. Across studies, stimuli that had initially been presented in a semantically congruent multisensory context yield higher accuracy than stimuli that had only been experienced in a unisensory context. The other multisensory contexts generally result in no difference or even a performance impairment relative to the unisensory context.
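
To make the trial structure concrete, the plain-Python sketch below builds one such intermixed block for the visual version of the task. The object labels, trial counts, condition balance, and field names are illustrative assumptions for this sketch, not the stimulus set or parameters used in the original studies.

import random

# Condition codes follow the caption: V- = initial presentation of the image
# alone; V+c / V+m / V+i = initial presentation paired with a semantically
# congruent, meaningless, or incongruent sound. Repeats are always image-only.
CONTEXTS = ["V-", "V+c", "V+m", "V+i"]
SOUND = {"V-": None, "V+c": "congruent", "V+m": "meaningless", "V+i": "incongruent"}

objects = [f"object_{i:03d}" for i in range(40)]   # hypothetical image labels
random.shuffle(objects)
assignments = [(obj, CONTEXTS[i % len(CONTEXTS)]) for i, obj in enumerate(objects)]
random.shuffle(assignments)

# Initial presentations in random order ...
trials = [{"image": obj, "sound": SOUND[ctx], "type": "initial",
           "condition": ctx, "correct_response": "new"}
          for obj, ctx in assignments]

# ... then each image-only repeat is inserted at a random later position, so
# initial and repeated presentations end up intermixed within the block while
# every repeat still follows its own initial presentation.
for obj, ctx in assignments:
    first_idx = next(i for i, t in enumerate(trials)
                     if t["image"] == obj and t["type"] == "initial")
    trials.insert(random.randint(first_idx + 1, len(trials)),
                  {"image": obj, "sound": None, "type": "repeat",
                   "condition": ctx, "correct_response": "old"})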
Figure 3
Typical ERP findings showing differences between responses to unisensory stimuli (visual on the left side of the figure, auditory on the right) according to whether they had initially been encountered in a semantically congruent multisensory context or a unisensory context (V+c/A+c and V−/A−, respectively). The uppermost row shows ERPs from a right parieto-occipital electrode (P8) and a fronto-central electrode (FCz). The shaded region marks periods of significant modulation. The middle row shows that these ERP modulations were due to topographic differences between conditions. Topographic maps are displayed on a flattened projection of the electrode montage, with the nasion upward and the left hemisphere on the left. Red colours indicate positive potential and blue colours negative potential. The lowermost row shows loci of significant differences in distributed source estimations. For the visual task, stronger source activity was observed for V+c than V− within the right lateral occipital complex (LOC). For the auditory task, stronger source activity was observed for A+c than A− within the right superior temporal cortex (STC). Full details can be found in the original publications (Murray et al., 2004; Matusz et al., 2015).
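
For orientation, the NumPy sketch below illustrates the kind of condition-wise ERP contrast (V+c minus V−) underlying the waveforms in the top row. The array shapes, electrode index, sampling rate, and baseline length are assumptions for illustration only; the statistical, topographic, and source analyses reported in the original papers are not reproduced here.

import numpy as np

fs = 512                 # sampling rate in Hz (assumed)
p8 = 25                  # index of electrode P8 in the montage (assumed)
baseline_samples = 51    # ~100 ms pre-stimulus baseline at 512 Hz (assumed)

def erp(epochs):
    """Baseline-correct each epoch, then average across trials.
    `epochs` has shape (n_trials, n_channels, n_timepoints)."""
    baseline = epochs[:, :, :baseline_samples].mean(axis=2, keepdims=True)
    return (epochs - baseline).mean(axis=0)        # (n_channels, n_timepoints)

# Placeholder arrays standing in for epochs whose images had initially appeared
# in a semantically congruent multisensory (V+c) vs. purely visual (V-) context.
epochs_vc = np.random.randn(60, 64, 410)
epochs_v  = np.random.randn(60, 64, 410)

diff_p8 = erp(epochs_vc)[p8] - erp(epochs_v)[p8]   # V+c minus V- waveform at P8
times_ms = (np.arange(diff_p8.size) - baseline_samples) / fs * 1000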
