J Neurosci. 2009 May 20;29(20):6490-9. doi: 10.1523/JNEUROSCI.5437-08.2009.

Natural, metaphoric, and linguistic auditory direction signals have distinct influences on visual motion processing


Sepideh Sadaghiani et al. J Neurosci. 2009.

Abstract

To interact with our dynamic environment, the brain merges motion information from auditory and visual senses. However, not only "natural" auditory MOTION, but also "metaphoric" de/ascending PITCH and SPEECH (e.g., "left/right"), influence the visual motion percept. Here, we systematically investigate whether these three classes of direction signals influence visual motion perception through shared or distinct neural mechanisms. In a visual-selective attention paradigm, subjects discriminated the direction of visual motion at several levels of reliability, with an irrelevant auditory stimulus being congruent, absent, or incongruent. Although the natural, metaphoric, and linguistic auditory signals were equally long and adjusted to induce a comparable directional bias on the motion percept, they influenced visual motion processing at different levels of the cortical hierarchy. A significant audiovisual interaction was revealed for MOTION in left human motion complex (hMT+/V5+) and for SPEECH in right intraparietal sulcus. In fact, the audiovisual interaction gradually decreased in left hMT+/V5+ for MOTION > PITCH > SPEECH and in right intraparietal sulcus for SPEECH > PITCH > MOTION. In conclusion, natural motion signals are integrated in audiovisual motion areas, whereas the influence of culturally learnt signals emerges primarily in higher-level convergence regions.


Figures

Figure 1.
Visual stimuli and experimental design. A, Visual stimuli consisted of a single dot, presented sequentially along a target trajectory (target stream). Unreliability was introduced by adding a second dot moving along the same trajectory but in the opposite direction (distractor stream). For illustration purposes, only seven (instead of 14) dot positions are represented. For details, see Materials and Methods. B, In a visual-selective attention paradigm, subjects discriminated the direction of the visual motion. Each experiment (i.e., MOTION, PITCH, and SPEECH) conformed to a factorial design manipulating (1) reliability of the visual motion direction (5 or 2 levels) and (2) the auditory direction signal (3 levels: directionally congruent, absent, or incongruent to the visual signal).
Figure 2.
Behavioral biases induced by the different auditory direction signals. A, Psychometric functions with PSE of a representative subject from the psychophysics study were obtained for upward/rightward, no auditory, and downward/leftward conditions in the MOTION, PITCH, and SPEECH experiments. Error bars indicate 95% confidence intervals. B, C, Across-subjects mean of the PSE as a measure for directional bias in the psychophysics (B) and fMRI study (C). Error bars indicate ±SEM. The directional bias was significant and statistically indistinguishable across MOTION, PITCH, and SPEECH experiments.
Figure 3.
Influence of auditory direction signals on visual motion processing in the MOTION (A) and SPEECH (B) experiments. Left, Parameter estimates for AVunreliable, Vunreliable, AVreliable, and Vreliable for the MOTION experiment in the left hMT+/V5+ [(−39, −72, 0)] (A) and the SPEECH experiment in the right IPS [(45, −45, 39)] (B). The bars for the bimodal conditions represent combined estimates from congruent (bottom part of the bars) and incongruent (top part of the bars) conditions. Error bars indicate 90% confidence intervals. Middle, The activations pertaining to the audiovisual interaction (AVunreliable − Vunreliable) > (AVreliable − Vreliable) for MOTION (A) and SPEECH (B) are displayed on axial and sagittal slices of a mean EPI image created by averaging the subjects' normalized echo planar images (height threshold, p < 0.01; spatial extent >0 voxels; see Materials and Methods). Right, Contextual modulation of the AV interaction: parameter estimates of the audiovisual interaction effect in MOTION (blue), PITCH (red), and SPEECH (green) are displayed for the left hMT+/V5+ [(−39, −72, 0)] (A) and the right IPS [(48, −45, 45)] (B). In the left hMT+/V5+, the interaction effect gradually decreases for MOTION > PITCH > SPEECH; in the right IPS, it gradually increases for MOTION < PITCH < SPEECH. Error bars indicate 90% confidence intervals.
Figure 4.
Overview of the neural systems mediating the influence of naturalistic, metaphoric, and linguistic auditory direction signals on visual motion processing. The audiovisual interaction effects for MOTION (blue), PITCH (red), and SPEECH (green) are rendered on a template of the whole brain (height threshold, p < 0.01 uncorrected; spatial extent >10 voxels; see Materials and Methods).
Figure 5.
Increased activations for incongruent relative to congruent visuoauditory stimuli in the SPEECH experiment are shown on axial and coronal slices of a mean echo planar image created by averaging the subjects' normalized echo planar images (height threshold, p < 0.01; spatial extent >0 voxels; see Materials and Methods).
