J Neurosci. 2016 May 18;36(20):5462-71. doi: 10.1523/JNEUROSCI.4310-15.2016.

Multimodal Feature Integration in the Angular Gyrus during Episodic and Semantic Retrieval


Heidi M Bonnici et al. J Neurosci. 2016.

Abstract

Much evidence from distinct lines of investigation indicates the involvement of angular gyrus (AnG) in the retrieval of both episodic and semantic information, but the region's precise function and whether that function differs across episodic and semantic retrieval have yet to be determined. We used univariate and multivariate fMRI analysis methods to examine the role of AnG in multimodal feature integration during episodic and semantic retrieval. Human participants completed episodic and semantic memory tasks involving unimodal (auditory or visual) and multimodal (audio-visual) stimuli. Univariate analyses revealed the recruitment of functionally distinct AnG subregions during the retrieval of episodic and semantic information. Consistent with a role in multimodal feature integration during episodic retrieval, significantly greater AnG activity was observed during retrieval of integrated multimodal episodic memories compared with unimodal episodic memories. Multivariate classification analyses revealed that individual multimodal episodic memories could be differentiated in AnG, with classification accuracy tracking the vividness of participants' reported recollections, whereas distinct unimodal memories were represented in sensory association areas only. In contrast to episodic retrieval, AnG was engaged to a statistically equivalent degree during retrieval of unimodal and multimodal semantic memories, suggesting a distinct role for AnG during semantic retrieval. Modality-specific sensory association areas exhibited corresponding activity during both episodic and semantic retrieval, which mirrored the functional specialization of these regions during perception. The results offer new insights into the integrative processes subserved by AnG and its contribution to our subjective experience of remembering.

Significance statement: Using univariate and multivariate fMRI analyses, we provide evidence that functionally distinct subregions of angular gyrus (AnG) contribute to the retrieval of episodic and semantic memories. Our multivariate pattern classifier could distinguish episodic memory representations in AnG according to whether they were multimodal (audio-visual) or unimodal (auditory or visual) in nature, whereas statistically equivalent AnG activity was observed during retrieval of unimodal and multimodal semantic memories. Classification accuracy during episodic retrieval scaled with the trial-by-trial vividness with which participants experienced their recollections. Therefore, the findings offer new insights into the integrative processes subserved by AnG and how its function may contribute to our subjective experience of remembering.

Keywords: fMRI; memory; multivoxel pattern analysis; parietal lobe; recollection.


Figures

Figure 1.
Episodic and semantic memory tasks performed inside the scanner. A, Episodic memory: Participants completed 72 trials of the episodic task. In each trial, participants first saw a cue telling them which clip to recall. They then closed their eyes for 6 s and recollected the clip as vividly as possible. Once the 6 s had elapsed, they heard a tone signaling them to open their eyes. They then rated the vividness of the recollection and indicated in which of the three modalities the clip had originally been presented. A 4 s fixation pause followed before the next trial commenced. B, Semantic memory: Participants completed 72 trials of the semantic task. Each trial started with a cue word for which they were to generate associations. They then closed their eyes for 6 s and thought of as many words associated with the cue as possible. Once the 6 s had elapsed, they heard a tone signaling them to open their eyes, rate how related the words they had thought of were to the cue word, and report the number of words they came up with in the 6 s (1 = none, 2 = up to three words, 3 = more than three words). A 4 s fixation pause followed before the next trial commenced.
Figure 2.
Decoding of episodic memories. Individual memories were decoded using a 10-fold cross-validation approach. Importantly, classification was undertaken separately within the auditory (A, green), the visual (B, purple), and the audio-visual (C, blue) condition. For example, the classifier was trained to differentiate the auditory memories (A) for “chirp,” “siren,” and “knock” using a subset of the data. It was then tested on a left-out set of auditory trials that was not included in the training data set. A corresponding approach was used for trials of the visual memories (B) and the audio-visual memories (C). D, Three-way classification accuracies in each of the ROIs: MTG, FG, and AnG. Note: Chance = 33.33%. Error bars indicate SEM. *p < 0.05, **p < 0.005, one-tailed t test.
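The caption above describes three-way classification of individual memories within each modality, with accuracy assessed against a 33.33% chance level via 10-fold cross-validation. The sketch below illustrates that logic on simulated voxel patterns using a simple nearest-centroid classifier; it is not the authors' pipeline (their classifier, preprocessing, and data differ), and all names and parameters here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_roi_patterns(n_per_class=20, n_voxels=50, noise=1.0):
    # Simulated trial-wise voxel patterns for three memories within one
    # modality (e.g., "chirp", "siren", "knock" in the auditory condition).
    centroids = rng.normal(size=(3, n_voxels))
    X = np.vstack([centroids[k] + noise * rng.normal(size=(n_per_class, n_voxels))
                   for k in range(3)])
    y = np.repeat(np.arange(3), n_per_class)
    return X, y

def nearest_centroid_cv(X, y, n_folds=10):
    """10-fold cross-validation: train on 9 folds, test on the left-out fold."""
    idx = rng.permutation(len(y))
    correct = 0
    for fold in np.array_split(idx, n_folds):
        train = np.setdiff1d(idx, fold)
        # Class centroids estimated from the training trials only.
        cents = np.vstack([X[train][y[train] == k].mean(axis=0) for k in range(3)])
        for i in fold:
            pred = np.argmin(np.linalg.norm(cents - X[i], axis=1))
            correct += int(pred == y[i])
    return correct / len(y)

X, y = simulate_roi_patterns()
acc = nearest_centroid_cv(X, y)  # compare against the 1/3 chance level
```

With well-separated simulated patterns the accuracy sits well above the 33.33% chance level; the paper's analysis additionally tests accuracies against chance with one-tailed t tests across participants.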
Figure 3.
Modality-specific regional signal change during episodic and semantic memory. A, Left, Percentage signal change for auditory (green), visual (purple), and audio-visual (blue) memories in the peak voxel for the contrast “auditory − visual” in MTG during episodic memory. Right, Left MTG activation for the same contrast. B, Same as A but for semantic memory. C, Overlay of activation maps shown in A and B. Red indicates activations selective to episodic memory, green indicates activations selective to semantic memory, and yellow indicates an overlap in activations. D, Left, Percentage signal change for auditory (green), visual (purple), and audio-visual (blue) memories in the peak voxel for the contrast “visual − auditory” in the FG during episodic memory. Right, Left FG activation for the same contrast. E, Same as D but for semantic memory. F, Overlay of activation maps shown in D and E. Note: Peak voxels were identified within the predefined spherical 6 mm ROIs using the contrasts specified above. Error bars indicate SEM. *p < 0.05, two-tailed t tests. Activations are shown at a threshold of p < 0.05 (whole-brain, uncorrected) and for the purposes of visualization are masked to the AAL mask for the ROI.
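The percentage-signal-change values plotted in these figures express a condition's BOLD response relative to baseline. A common, simplified definition is 100 × (event mean − baseline) / baseline, averaged over a post-stimulus window at the peak voxel. The sketch below illustrates that computation on a hypothetical time series; the function name, TR, and window are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def percent_signal_change(voxel_ts, onsets, baseline, tr=2.0, window=(4.0, 8.0)):
    """Mean % signal change in a post-stimulus window at one voxel.

    voxel_ts : 1-D array of BOLD values at the peak voxel (one per TR).
    onsets   : event onset times in seconds.
    baseline : mean baseline signal (e.g., from fixation periods).
    window   : (start, end) of the averaging window, seconds after onset.
    All defaults here are hypothetical, for illustration only.
    """
    lo, hi = int(window[0] / tr), int(window[1] / tr)
    event_means = [voxel_ts[int(t / tr) + lo : int(t / tr) + hi].mean()
                   for t in onsets]
    return 100.0 * (np.mean(event_means) - baseline) / baseline

# Toy example: a flat baseline of 100 with a 2% bump after one event at t = 10 s.
ts = np.full(100, 100.0)
ts[7:9] = 102.0
psc = percent_signal_change(ts, onsets=[10.0], baseline=100.0)  # -> 2.0
```

In practice, packages such as SPM (used in fMRI studies of this kind) derive comparable estimates from fitted GLM parameters rather than raw window averages; the sketch only conveys the underlying idea.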
Figure 4.
Modality-specific regional signal change during episodic and semantic memory. A, Left, Percentage signal change for auditory (green), visual (purple), and audio-visual (blue) memories in the peak AnG voxel (−42, −66, 42) for the contrast “audio-visual − (auditory + visual)” during episodic memory in the predefined spherical 6 mm ROI. Right, Left AnG activation for the same contrast. B, Left, Same as in A but for the peak AnG voxel (−42, −78, 36) for the main effect of audio-visual trials during semantic memory. Right, Left AnG activation for the same contrast. C, Overlay of activation maps shown in A and B. Red indicates activations selective to episodic memory, green indicates activations selective to semantic memory, and yellow indicates an overlap in activations. The overlap shown in yellow comprised 1 voxel (−45, −72, 36) and did not reach significance in a conjunction analysis (t(15) = 1.85, p = 0.160). Note: Peak voxels were identified within the predefined spherical 6 mm ROIs using the contrasts specified above. Error bars indicate SEM. *p < 0.05, two-tailed t tests. Activations are shown at a threshold of p < 0.05 (whole-brain, uncorrected) and for the purposes of visualization are masked to the AAL mask for the ROI.

