Synchronizing to real events: subjective audiovisual alignment scales with perceived auditory depth and speed of sound
- PMID: 15668388
- PMCID: PMC548526
- DOI: 10.1073/pnas.0407034102
Abstract
Because of the slow speed of sound relative to light, acoustic and visual signals from a distant event often will be received asynchronously. Here, using acoustic signals with a robust cue to sound source distance, we show that judgments of perceived temporal alignment with a visual marker depend on the depth simulated in the acoustic signal. For distant sounds, a large delay of sound relative to vision is required for the signals to be perceived as temporally aligned. For nearer sources, the time lag corresponding to audiovisual alignment is smaller and scales at a rate approximating the speed of sound. Thus, when robust cues to auditory distance are present, the brain can synchronize disparate audiovisual signals to external events despite considerable differences in time of arrival at the perceiver. This ability is functionally important, as it allows auditory and visual signals to be synchronized to the external event that caused them.
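The physical asynchrony the abstract describes follows directly from the travel time of sound. A minimal sketch (not from the paper, and assuming a speed of sound of roughly 343 m/s in air) of the acoustic delay relative to the effectively instantaneous visual signal:

```python
# Illustrative sketch: how far sound lags light for a source at distance d.
# The 343 m/s figure is an assumed typical value for air at ~20 °C.
SPEED_OF_SOUND_M_PER_S = 343.0

def audio_delay_ms(distance_m: float) -> float:
    """Delay (ms) of sound arrival relative to light from the same event.

    Light's travel time is negligible at these distances, so the lag is
    simply the acoustic travel time.
    """
    return distance_m / SPEED_OF_SOUND_M_PER_S * 1000.0

for d in (1.0, 10.0, 34.3):
    print(f"{d:5.1f} m -> {audio_delay_ms(d):6.1f} ms lag")
```

At 34.3 m the lag is 100 ms, well within the range of audiovisual asynchronies observers can notice, which is why distance-dependent compensation of the kind the abstract reports is notable.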
