J Neurosci. 2011 Nov 30;31(48):17496–17504. doi: 10.1523/JNEUROSCI.5030-10.2011.

Influence of static eye and head position on tone-evoked gaze shifts


Tom J Van Grootel et al. J Neurosci.

Abstract

The auditory system represents sound-source directions initially in head-centered coordinates. To program eye-head gaze shifts to sounds, the orientation of eyes and head should be incorporated to specify the target relative to the eyes. Here we test (1) whether this transformation involves a stage in which sounds are represented in a world- or a head-centered reference frame, and (2) whether acoustic spatial updating occurs at a topographically organized motor level representing gaze shifts, or within the tonotopically organized auditory system. Human listeners generated head-unrestrained gaze shifts from a large range of initial eye and head positions toward brief broadband sound bursts, and to tones at different center frequencies, presented in the midsagittal plane. Tones were heard at a fixed illusory elevation, regardless of their actual location, that depended in an idiosyncratic way on initial head and eye position, as well as on the tone's frequency. Gaze shifts to broadband sounds were accurate, fully incorporating initial eye and head positions. The results support the hypothesis that the auditory system represents sounds in a supramodal reference frame, and that signals about eye and head orientation are incorporated at a tonotopic stage.


Figures

Figure 1.
Example vector schemes for remapping sound into a goal-directed gaze shift. A, The head-centered scheme. The head-centered sound representation (TH, gray arrow) should be transformed into an oculocentric motor command (ΔG, bold black arrow), by subtracting eye-in-head orientation (EH, gray dotted arrow; Eq. 1). B, C, The world-centered scheme. B, Head-centered sound coordinates are transformed into a supramodal world-centered (TS, bold black arrow) reference frame, by incorporating head orientation at sound onset (HS,0, gray dotted arrow; Eq. 2). C, The world-centered sound (TS, gray arrow) is transformed into an oculocentric motor command (ΔG, bold black arrow) by incorporating both eye and head orientation (EH and HS, gray dotted arrows; Eq. 3).
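The two remapping schemes in this caption reduce to simple vector arithmetic. The following is a minimal sketch, assuming scalar elevations in degrees and variable names taken from the caption's Eqs. 1–3; it is an illustration, not the paper's analysis code:

```python
# Sketch of the two remapping schemes from Figure 1 (Eqs. 1-3).
# All quantities are elevations in degrees; names follow the caption.

def head_centered_gaze_shift(T_H, E_H):
    """Eq. 1: oculocentric gaze command from a head-centered target."""
    return T_H - E_H

def world_centered_target(T_H, H_S0):
    """Eq. 2: world-centered target from head-centered coordinates,
    using head orientation at sound onset."""
    return T_H + H_S0

def world_centered_gaze_shift(T_S, H_S, E_H):
    """Eq. 3: oculocentric gaze command from a world-centered target."""
    return T_S - H_S - E_H

# With a static head (H_S equals its value at sound onset, H_S0),
# the two schemes prescribe the same gaze shift:
T_H, E_H, H_S = 10.0, -5.0, 20.0
T_S = world_centered_target(T_H, H_S)
assert head_centered_gaze_shift(T_H, E_H) == world_centered_gaze_shift(T_S, H_S, E_H)
```

The schemes therefore only dissociate when head orientation changes between sound onset and the gaze shift, or when, as for tones, the target representation itself is erroneous.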
Figure 2.
Spatial and temporal layout of experimental trials. A–C, Spatial configurations of EH, HS, and EH&HS variable blocks. Diamonds, Sound locations; head and gaze positions are indicated by squares and dots, respectively. D–F, Temporal events for the same blocks as in A–C. G–I, Distributions of elevation difference between actual and desired eye-in-head (EH, G), head in space (HS, H), and eye-in-space (ES, I). Bins are 2° in elevation and subjects are stacked (gray shades). Because ES = HS + EH, and ES is accurate (i.e., errors are near zero), EH and HS distributions are negatively correlated.
Figure 3.
Gaze (black) and head (gray) trajectories of example trials of listener S6. Horizontal dashed lines indicate sound-in-space locations (thick black line: target timing). Dots on the abscissa mark the LED location that guided initial eye position, either aligned with the head or straight ahead. A, C, Gaze shifts to BB noise stimuli are accurate. B, D, Saccades to tones (9 and 5 kHz) are not goal-directed. A, B, Variable HS, EH centered in head. C, D, Variable EH, HS straight ahead.
Figure 4.
Elevation localization performance for broadband noise (A–D) and the 5 kHz tone (E–H) of listener S6, for all initial eye and head orientations. A, E, Gaze-shift (ΔG) response as a function of gaze-motor error (required gaze shift, TE = TH − EH). Dashed diagonal lines indicate the ideal, accurate response. Solid black lines show the best linear fit to the data. Note the contrast between highly accurate gaze shifts for broadband sounds and highly inaccurate responses for tones. B–D and F–H show the residual effect of a single variable on the gaze shift (after accounting for the other variables through the MLR analysis) for broadband sounds and tones, respectively. Solid black lines represent the MLR slopes. B, Strong correlation of broadband-evoked gaze shifts with target-in-space (TS). C, Strong negative effect of head-in-space orientation (HS). D, Strong negative effect of eye-in-head orientation (EH). For tones, responses are not goal-directed (F), but do vary systematically with HS (G) and EH (H). Diagonal lines with slope −1 indicate the ideal response. Thick dashed lines in G indicate the predicted influence of head orientation for the head- and world-centered schemes. The coefficients from the MLR (Eq. 5) are shown above the panels; values that differ significantly from zero are marked with an asterisk (*).
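The MLR analysis referenced in this caption can be sketched as an ordinary least-squares fit. This is a hypothetical illustration only: Eq. 5 is assumed here to have the form ΔG = a·TS + b·HS + c·EH + d (the exact equation is in the paper's Methods, which this excerpt does not reproduce), and the data are simulated, not the listeners' responses:

```python
import numpy as np

# Hypothetical sketch of the MLR behind Figure 4, assuming
# Eq. 5 has the form dG = a*T_S + b*H_S + c*E_H + d.

rng = np.random.default_rng(0)
n = 200
T_S = rng.uniform(-30, 30, n)   # target-in-space elevation (deg)
H_S = rng.uniform(-20, 20, n)   # head-in-space orientation (deg)
E_H = rng.uniform(-15, 15, n)   # eye-in-head orientation (deg)

# Simulated "ideal" broadband responses: dG = T_S - H_S - E_H (cf. Eq. 3),
# plus response noise.
dG = T_S - H_S - E_H + rng.normal(0.0, 1.0, n)

# Least-squares fit of the regression coefficients.
X = np.column_stack([T_S, H_S, E_H, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, dG, rcond=None)
a, b, c, d = coef
# Accurate responses should yield a near +1 and b, c near -1 (cf. Fig. 5).
```

Slopes near +1 for TS and near −1 for HS and EH correspond to full compensation of initial eye and head position, which is the pattern the broadband data show.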
Figure 5.
Results of MLR (Eq. 5) for all listeners (gray-coded) to broadband stimuli in elevation for different static initial eye and head orientations. Data from all recording sessions were pooled. Response bias for each subject is indicated at the right side. Note the different scales. Error bars correspond to one standard deviation. Dashed lines at +1 and −1 denote ideal target representation and full compensation of eye and head positions, respectively.
Figure 6.
A, MLR results for 5 kHz tone-evoked responses of all listeners (gray-coded). B, MLR results for the responses of listener S6 to all tones. C, Regression coefficients of the gaze shifts for all subjects to broadband noise (black lines and circles) and all tones (black lines and gray patch), presented as cumulative distributions.
Figure 7.
Comparison of goodness of fit for the gaze-shift data between the world-centered (Eq. 5) and head-centered (Eq. 6) regression models. Symbols correspond to all broadband noise (squares; N = 8) and pure-tone experiments (circles; frequency in kHz indicated; N = 36) for all subjects (each shaded a unique gray). Each symbol gives the adjusted R2 value for either model (Eq. 7) for each subject and stimulus. Note that the models cannot be dissociated for the broadband-evoked gaze shifts, because the contributions of target location and head orientation are close to the ideal values of +1 and −1, respectively (see Fig. 4B,C); hence the adjusted R2 values lie along the identity line. The tone responses, however, do provide a clear distinction, as the vast majority of points lie below the diagonal, indicating that the responses are best explained by the world-centered model.
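The model comparison rests on the adjusted R2, which penalizes a model for extra free parameters. A small sketch, assuming the standard definition (the paper's Eq. 7 is not reproduced in this excerpt, so this is an assumption):

```python
import numpy as np

# Hypothetical helper for the model comparison in Figure 7,
# assuming the standard adjusted-R^2 definition.

def adjusted_r2(y, y_hat, n_params):
    """Adjusted coefficient of determination for a fit with n_params
    free parameters (intercept included)."""
    y = np.asarray(y, dtype=float)
    y_hat = np.asarray(y_hat, dtype=float)
    n = y.size
    ss_res = np.sum((y - y_hat) ** 2)       # residual sum of squares
    ss_tot = np.sum((y - y.mean()) ** 2)    # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    return 1.0 - (1.0 - r2) * (n - 1) / (n - n_params)

# The same residuals give a lower adjusted R^2 for a model with more
# parameters, so a larger model must earn its extra terms.
y = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y_hat = y + np.array([0.1, -0.1, 0.1, -0.1, 0.1, -0.1])
assert adjusted_r2(y, y_hat, 2) > adjusted_r2(y, y_hat, 3)
```

This penalty is what makes the comparison fair when the two regression models differ in their number of predictors.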
Figure 8.
Conceptual scheme of the audio-motor system. Sound is filtered by the pinna's head-related transfer functions (HRTFs) before reaching the cochlea, so the auditory input encodes a head-centered target location. The ascending auditory pathway is tonotopically organized. Initial head position interacts in a frequency-specific way (connectivity strength indicated by circles) to form a world-centered target representation in the population of narrow-band channels. To program the eye–head gaze shift, signals about current head and eye position interact with the world-centered target signal at the level of the tonotopic arrays and at the SC gaze-motor map. The eye-centered gaze command thus depends on eye and head position, whereas the head-motor command depends only on current eye position (light-gray arrow). In this way, the eyes and head are driven by different neural commands: ΔG and ΔH.
