Comparative Study

Hum Brain Mapp. 2001 Aug;13(4):239-52. doi: 10.1002/hbm.1036.

fMRI investigation of sentence comprehension by eye and by ear: modality fingerprints on cognitive processes

E B Michael et al.

Abstract

The neural substrate underlying reading vs. listening comprehension of sentences was compared using fMRI. One way in which this issue was addressed was by comparing the patterns of activation particularly in cortical association areas that classically are implicated in language processing. The precise locations of the activation differed between the two modalities. In the left inferior frontal gyrus (Broca's area), the activation associated with listening was more anterior and inferior than the activation associated with reading, suggesting more semantic processing during listening comprehension. In the left posterior superior and middle temporal region (roughly, Wernicke's area), the activation for listening was closer to primary auditory cortex (more anterior and somewhat more lateral) than the activation for reading. In several regions, the activation was much more left lateralized for reading than for listening. In addition to differences in the location of the activation, there were also differences in the total amount of activation in the two modalities in several regions. A second way in which the modality comparison was addressed was by examining how the neural systems responded to comprehension workload in the two modalities by systematically varying the structural complexity of the sentences to be processed. Here, the distribution of the workload increase associated with the processing of additional structural complexity was very similar across the two input modalities. The results suggest a number of subtle differences in the cognitive processing underlying listening vs. reading comprehension.

Figures

Figure 1
The schematic drawing in the center of the figure shows several of the left hemisphere ROIs, adapted from the parcellation scheme described and depicted in Caviness et al. [1996]: Inferior Frontal Gyrus, Temporal, Extrastriate (which also includes inferior temporal), Inferior Parietal, Superior Parietal, Dorsolateral Prefrontal Cortex, and the Frontal Eye Fields. Each area shaded in gray represents an ROI, as indicated by the arrows and associated labels. The associated graphs depict the amount of activation in a given ROI as a function of modality and sentence complexity. Amount of activation is defined as the percent change in signal intensity (compared to fixation) summed across all voxels in an ROI that are active (t > 5.0) in any condition. Note that the graphs are not all on the same scale. Error bars represent 95% confidence intervals calculated as the square root of MS_e/n, where MS_e is the pooled error term for both of the independent variables [Loftus and Masson, 1994]. Confidence intervals that descended below zero were truncated at the abscissa.
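As a rough illustration of the activation measure and error bars described in this caption, the Python/NumPy sketch below (hypothetical variable names and data layout; not code from the study) sums percent signal change over supra-threshold voxels and computes the error-bar half-width as the square root of MS_e/n.

    import numpy as np

    T_THRESHOLD = 5.0  # a voxel counts as active if t > 5.0 in any condition

    def roi_activation(condition_means, fixation_mean, t_values):
        # condition_means: dict {condition: per-voxel mean signal (1-D array)}
        # fixation_mean:   1-D array of per-voxel mean signal during fixation
        # t_values:        dict {condition: per-voxel t statistic vs. fixation}
        # Returns {condition: percent signal change, summed over voxels that
        # exceed the threshold in any condition}, as defined in the caption.
        active = np.zeros_like(fixation_mean, dtype=bool)
        for t in t_values.values():
            active |= t > T_THRESHOLD
        activation = {}
        for cond, means in condition_means.items():
            pct_change = 100.0 * (means - fixation_mean) / fixation_mean
            activation[cond] = pct_change[active].sum()
        return activation

    def error_bar_half_width(ms_error, n):
        # Error bars as described in the caption: square root of MS_e / n,
        # where MS_e is the pooled error term [Loftus and Masson, 1994].
        return np.sqrt(ms_error / n)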
Figure 2
Each graph depicts the amount of activation, defined as in Figure 1, in a given ROI as a function of modality and sentence complexity for the right homologues. Most show patterns similar to those for the corresponding left hemisphere region, with the exception of the right temporal region. In each graph the scale is the same as that of its left homologue in Figure 1.
Figure 3
An activation image of one slice for one participant, with the left temporal ROI outlined, for each of the four experimental conditions (Visual Active, Visual Object Relative, Auditory Active, and Auditory Object Relative). The volume of activation was greater in the auditory conditions than in the visual conditions, and the amount of activation increased with increasing sentence complexity.
Figure 4
The effect of presentation modality in left inferior frontal gyrus for Object Relative sentences is shown for one representative sagittal slice of one participant. Yellow voxels were active in both the visual and auditory conditions, the single red voxel was active in only the visual condition, and blue voxels were active in only the auditory condition.
Figure 5
For each condition, the voxels that activated above threshold in that condition were divided into sets according to the other conditions in which they were also active. The height of each bar segment represents the number of voxels in that set. Note that the upper panel (left temporal) and lower panel (left inferior frontal gyrus) use different scales.
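The set decomposition described for Figure 5 can be sketched as follows (Python; hypothetical inputs and names, reconstructing only the counting logic stated in the caption): each supra-threshold voxel is assigned to the exact combination of conditions in which it is active, and the voxels in each combination are tallied.

    from collections import Counter
    import numpy as np

    def overlap_counts(t_values, threshold=5.0):
        # t_values: dict {condition: per-voxel t statistic}, equal-length arrays.
        # Returns a Counter mapping each tuple of condition names to the number
        # of voxels that are above threshold in exactly that set of conditions.
        conditions = sorted(t_values)
        above = np.stack([t_values[c] > threshold for c in conditions])  # (n_cond, n_vox)
        counts = Counter()
        for voxel_pattern in above.T:
            active_set = tuple(c for c, on in zip(conditions, voxel_pattern) if on)
            if active_set:  # ignore voxels below threshold in every condition
                counts[active_set] += 1
        return counts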
