Representation of Auditory Motion Directions and Sound Source Locations in the Human Planum Temporale

Ceren Battal et al. J Neurosci. 2019 Mar 20;39(12):2208-2220. doi: 10.1523/JNEUROSCI.2289-18.2018. Epub 2019 Jan 16.

Abstract

The ability to compute the location and direction of sounds is a crucial perceptual skill for interacting efficiently with dynamic environments. How the human brain implements spatial hearing is, however, poorly understood. In our study, we used fMRI to characterize the brain activity of male and female humans listening to sounds moving left, right, up, and down, as well as static sounds. Whole-brain univariate results contrasting moving and static sounds varying in their location revealed a robust functional preference for auditory motion in the bilateral human planum temporale (hPT). Using an independently localized hPT, we show that this region contains information about auditory motion directions and, to a lesser extent, sound source locations. Moreover, hPT showed an axis-of-motion organization reminiscent of the functional organization of the middle-temporal cortex (hMT+/V5) for vision. Importantly, whereas motion direction and location rely on partially shared pattern geometries in hPT, as demonstrated by successful cross-condition decoding, the responses elicited by static and moving sounds were significantly distinct. Altogether, our results demonstrate that the hPT codes for both auditory motion and location, but that the underlying neural computation linked to motion processing is more reliable and partially distinct from the one supporting sound source location.

SIGNIFICANCE STATEMENT Compared with what we know about visual motion, little is known about how the brain implements spatial hearing. Our study reveals that motion directions and sound source locations can be reliably decoded in the human planum temporale (hPT) and that they rely on partially shared pattern geometries. Our study therefore sheds important new light on how computing the location or direction of sounds is implemented in the human auditory cortex by showing that those two computations rely on partially shared neural codes. Furthermore, our results show that the neural representation of moving sounds in hPT follows a "preferred axis of motion" organization, reminiscent of the coding mechanisms typically observed in the occipital middle-temporal cortex (hMT+/V5) region for computing visual motion.

Keywords: auditory motion; direction selectivity; fMRI; multivariate analyses; planum temporale; spatial hearing.

Figures

Figure 1.
Stimuli and experimental design. A, The acoustic apparatus used to present moving and static sounds while binaural recordings were made for each participant before the fMRI session. B, Auditory stimuli presented inside the MRI scanner consisted of eight conditions: leftward, rightward, upward, and downward moving stimuli; and left, right, up, and down static stimuli. Each condition was presented for 15 s (12 repetitions of a 1250 ms sound, no ISI), followed by a 7 s gap during which participants indicated the corresponding direction/location in space and 8 s of silence (total interblock interval, 15 s). Sound presentation and response button presses were pseudorandomized. Participants were asked to respond as accurately as possible during the gap period. C, Behavioral performance inside the scanner.
Figure 2.
Univariate whole-brain results. A, The association test map was obtained from the online tool Neurosynth using the term Planum Temporale (FDR corrected, p < 0.05). The black spheres illustrate a mask (radius = 6 mm, 117 voxels) drawn around the peak coordinate from Neurosynth (search term Planum Temporale, meta-analysis of 85 studies). B, Auditory motion processing [Motion > Static], thresholded at p < 0.05, whole-brain FWE corrected. C, Mean activity estimates (arbitrary units ± SEM) associated with the perception of auditory motion direction (red) and sound source location (blue). ML, Motion left; MR, motion right; MD, motion down; MU, motion up; SL, static left; SR, static right; SD, static down; SU, static up.
Figure 3.
Within-classification and cross-classification results. A, Classification results for the four conditions in the functionally defined hPT region. Within-condition and cross-condition classification results are shown in the same bar plots. Moving, four motion directions; Static, four sound source locations; Cross, cross-condition classification accuracies. B, Classification results of within-axes (left vs right, up vs down) and across-axes (left vs up, left vs down, right vs up, right vs down) motion directions. C, Classification results of within-axes (left vs right, up vs down) and across-axes (left vs up, left vs down, right vs up, right vs down) sound source locations. LvsR, Left vs Right; UvsD, Up vs Down; LvsU, Left vs Up; LvsD, Left vs Down; RvsU, Right vs Up; RvsD, Right vs Down classifications. FDR-corrected p values: *p < 0.05, **p < 0.01, ***p < 0.001 testing differences against chance level (dotted lines; see Materials and Methods).
Figure 4.
Pattern dissimilarity between motion directions and sound source locations. A, Across-condition classification results across the four conditions are shown for each ROI (lhPT and rhPT). Four binary classifications ([leftward motion vs left location], [rightward motion vs right location], [upward motion vs up location], and [downward motion vs down location]) were computed and averaged to produce one accuracy score per ROI. FDR-corrected p values: ***p < 0.001. Dotted lines represent chance level. B, The inset shows neural RDMs extracted from lhPT and rhPT, and the MDS plot visualizes the similarities of the neural patterns elicited by the four motion directions (arrows) and four sound source locations (dots). Color codes for arrows/dots: green, left direction/location; red, right direction/location; orange, up direction/location; blue, down direction/location. ML, Motion left; MR, motion right; MD, motion down; MU, motion up; SL, static left; SR, static right; SD, static down; SU, static up. C–E, RSA results in hPT. C, RDMs of the computational models that assume different similarities of the neural patterns across auditory motion and static conditions. D, E, RSA results for every model and each ROI. For each ROI, the dotted lines represent the reliability of the data given the signal-to-noise ratio (see Materials and Methods), which provides an estimate of the highest correlation that can be expected in a given ROI when correlating computational models and neural RDMs. Error bars indicate the SEM. IM1, Intermediate model with within-axis conditions distinct; IM2, intermediate model with within-axis conditions combined. The upper right corner of each bar plot visualizes significant differences for each class of models and hemisphere separately (Mann–Whitney–Wilcoxon rank-sum test, FDR corrected).
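The RSA logic behind Figure 4 can be sketched in a few lines: build a neural RDM from condition-wise activity patterns (here, correlation distance between every pair of the eight conditions) and rank-correlate it with a candidate model RDM. The patterns below are random placeholders and the single "moving vs static" model RDM is a simplified stand-in for the model family in panel C, not the study's actual model set.

```python
# Hedged RSA sketch: neural RDM (1 - Pearson correlation between
# condition patterns) compared against a toy model RDM via Spearman rho.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
conds = ["ML", "MR", "MU", "MD", "SL", "SR", "SU", "SD"]  # as in Figure 2C
patterns = rng.normal(size=(8, 117))  # placeholder mean pattern per condition

# Neural RDM: condensed vector of pairwise correlation distances (28 pairs)
neural_rdm = pdist(patterns, metric="correlation")

# Toy "condition" model RDM: moving and static conditions maximally distinct,
# conditions within the same class identical (distance 0).
is_motion = np.array([1, 1, 1, 1, 0, 0, 0, 0])
model_rdm = squareform(np.abs(is_motion[:, None] - is_motion[None, :]))

# Model fit: rank correlation between neural and model dissimilarities
rho, p = spearmanr(neural_rdm, model_rdm)
print(f"model fit (Spearman rho): {rho:.2f}")
```

In the real analysis, each model RDM in panel C would be correlated with the neural RDM this way, and the resulting fits compared against the noise-ceiling estimate described in Materials and Methods.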
