Comparative Study

Distinct cortical pathways for processing tool versus animal sounds

James W Lewis et al. J Neurosci. 2005 May 25;25(21):5148-58. doi: 10.1523/JNEUROSCI.0419-05.2005.

Abstract

Human listeners can effortlessly categorize a wide range of environmental sounds. Whereas categorizing visual object classes (e.g., faces, tools, houses, etc.) preferentially activates different regions of visually sensitive cortex, it is not known whether the auditory system exhibits a similar organization for different types or categories of complex sounds outside of human speech. Using functional magnetic resonance imaging, we show that hearing and correctly or incorrectly categorizing animal vocalizations (as opposed to hand-manipulated tool sounds) preferentially activated middle portions of the left and right superior temporal gyri (mSTG). On average, the vocalization sounds had much greater harmonic and phase-coupling content (acoustically similar to human speech sounds), which may represent some of the signal attributes that preferentially activate the mSTG regions. In contrast, correctly categorized tool sounds (and even animal sounds that were miscategorized as being tool-related sounds) preferentially activated a widespread, predominantly left hemisphere cortical "mirror network." This network directly overlapped substantial portions of motor-related cortices that were independently activated when participants pantomimed tool manipulations with their right (dominant) hand. These data suggest that the recognition processing for some sounds involves a causal reasoning mechanism (a high-level auditory "how" pathway), automatically evoked when attending to hand-manipulated tool sounds, that effectively associates the dynamic motor actions likely to have produced the sound(s).


Figures

Figure 1.
BOLD responses to correctly categorized tool and animal sounds (n = 20; all panels corrected to α < 0.05). Cortical regions activated by both tool and animal sounds (yellow to orange) relative to silence (a), tool-related sounds (red) relative to silence (b), and animal vocalization sounds (blue) relative to silence (c). Light green depicts cortex showing decreased BOLD signal relative to silence. d, Data from c effectively subtracted from b, revealing regions preferentially activated by tool sounds (red) or by animal sounds (blue). Bottom panels depict highly smoothed renderings of the Colin Atlas brain. e, Flat maps showing data from d. Black dashed outlines depict the approximate location of PAC+ from a (yellow; p < 0.00005). Identified visual areas (solid black outlines; V1, V2, V3, MT+, etc.) are from the Colin Atlas database. For details, see Results and Table 1. AS, Angular sulcus; CaS, calcarine sulcus; CeS, central sulcus; CiS, cingulate sulcus; CoS, collateral sulcus; FG, fusiform gyrus; IFG, inferior frontal gyrus; IFS, inferior frontal sulcus; IPrCeS, inferior precentral sulcus; IPS, intraparietal sulcus; ITS, inferotemporal sulcus; LaS, lateral sulcus; LOS, lateral occipital sulcus; MTG, middle temporal gyrus; Orb. S, orbital sulcus; OTS, occipito-temporal sulcus; PAC, primary auditory cortex; pITS, posterior inferotemporal sulcus; pMTG, posterior middle temporal gyrus; PoCeS, postcentral sulcus; POS, parieto-occipital sulcus; SFS, superior frontal sulcus; SPrCeS, superior precentral sulcus; STG, superior temporal gyrus; STS, superior temporal sulcus; TOS, transverse occipital sulcus.
Figure 2.
Group-averaged ROI analysis of activated areas from Figure 1d. Charts illustrate the mean and SE across 20 participants (unthresholded) of the relative BOLD signal intensity (arbitrary units) in response to correctly categorized tool sounds (“T,” red) versus silence and animal sounds (“A,” dark blue) versus silence. They also depict responses to animal sounds miscategorized as tools (“T̄,” pink) and tool sounds miscategorized as animals (“Ā,” light blue), across 17 participants who made errors in both sound categories. The left mesial frontal focus from Figure 1e showed BOLD responses below baseline (silence) to all sounds (data not shown). Brain images (axial slices) are from one participant and transformed into Talairach coordinate space.
Figure 3.
Group activation data evoked by manipulations of virtual tools using the right hand (green; n = 12; corrected α < 0.05) superimposed onto the data from Figure 1, d and e. The intermediate colors yellow and cyan depict regions of overlap. Some of the cerebellar cortex activation extended into the ventral occipital lobes but was removed for clarity. Other conventions are as in Figure 1.
Figure 4.
Quantitative comparisons of acoustical differences between categories of sound, illustrating the similarity between animal sounds (blue) and human speech sounds (black) relative to tool sounds (red). a-c, Example sound stimuli illustrating the corresponding amplitude plot, spectrogram, power spectrum (percentage power vs frequency), and HNR. d, Power spectra averaged across the 94 retained tool sounds (top) and 94 animal sounds (bottom) compared with 94 human speech sounds. Note that, although the average RMS power was balanced across sound categories, the average power spectra showed differences. The average power beyond 10 kHz in all categories was negligible and thus is not shown. e, Mean (plus variance) of HNR values across sound categories. f, Bicoherence analysis comparing the three categories of sound, illustrating different phase-coupling profiles. These and other samples of the sound stimuli can be heard at www.jneurosci.org, as supplemental material.
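The spectral measures summarized in Figure 4 (percent-power spectra and harmonics-to-noise ratios) can be approximated with standard signal-processing tools. The sketch below is a minimal illustration under stated assumptions, not the authors' actual analysis pipeline: the function names, the Welch-style windowing parameters, and the autocorrelation-based HNR estimator are all choices made here for clarity.

```python
import numpy as np

def percent_power_spectrum(x, fs, nfft=1024):
    """Welch-style averaged power spectrum, expressed as percent power
    per frequency bin (as in the Figure 4 power-spectrum panels)."""
    win = np.hanning(nfft)
    hop = nfft // 2
    segs = [x[i:i + nfft] * win for i in range(0, len(x) - nfft + 1, hop)]
    psd = np.mean([np.abs(np.fft.rfft(s)) ** 2 for s in segs], axis=0)
    freqs = np.fft.rfftfreq(nfft, 1.0 / fs)
    return freqs, 100.0 * psd / psd.sum()

def hnr_db(x, fs, fmin=75.0, fmax=1000.0):
    """Rough autocorrelation-based harmonics-to-noise ratio in dB.
    High values indicate strong periodicity (vocalization-like sounds);
    values near or below 0 dB indicate noise-like signals."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    r = np.correlate(x, x, mode="full")[len(x) - 1:]
    r = r / r[0]
    lo = int(fs / fmax)                      # shortest pitch period considered
    hi = int(fs / fmin)                      # longest pitch period considered
    rmax = min(r[lo:hi].max(), 1.0 - 1e-9)   # guard against log(0)
    return 10.0 * np.log10(rmax / (1.0 - rmax))

# Illustrative inputs: a harmonic (vocalization-like) tone vs broadband noise.
fs = 16000
t = np.arange(0, 0.5, 1.0 / fs)
tone = np.sin(2 * np.pi * 220 * t)
noise = np.random.default_rng(0).standard_normal(len(t))
```

On such inputs, the periodic tone yields a far higher HNR than the noise burst, mirroring the harmonic-content difference between the animal vocalizations and tool sounds described in the abstract.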

