Perception of asynchronous and conflicting visual and auditory speech
- PMID: 8817903
- DOI: 10.1121/1.417342
Erratum in
- J Acoust Soc Am 1997 Mar;101(3):1748
Abstract
Previous research has shown that perceivers naturally integrate auditory and visual information in face-to-face speech perception. Two experiments were carried out to study whether integration would be disrupted by differences in stimulus onset asynchrony (SOA), that is, the relative temporal arrival of the two sources of information. Synthetic visible syllables and natural and synthetic auditory syllables of /ba/, /va/, /ða/, and /da/ were used in an expanded factorial design to present all possible combinations of the auditory and visual syllables, as well as the unimodal syllables. The fuzzy logical model of perception (FLMP), which accurately describes integration, was used to measure the degree to which integration of audible and visible speech occurred. These findings provide information about the temporal window of integration and its apparent dependence on the range of speech events in the test.
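The FLMP treats each modality as providing an independent degree of support for every response alternative; the predicted identification probability is the product of the auditory and visual supports, normalized over all alternatives (the relative goodness rule). The sketch below illustrates that computation for an expanded factorial design. The syllable set and support values here are hypothetical placeholders for illustration only, not the parameters estimated in the study.

```python
# Minimal sketch of the fuzzy logical model of perception (FLMP),
# using made-up support values; the study estimated such parameters
# by fitting observed identification data.

import itertools

ALTERNATIVES = ["ba", "va", "da"]  # hypothetical response set for illustration

# Hypothetical degrees of support (0..1) that each unimodal syllable
# lends to each response alternative.
auditory_support = {
    "ba": {"ba": 0.90, "va": 0.08, "da": 0.02},
    "da": {"ba": 0.05, "va": 0.10, "da": 0.85},
}
visual_support = {
    "ba": {"ba": 0.80, "va": 0.15, "da": 0.05},
    "va": {"ba": 0.20, "va": 0.70, "da": 0.10},
}

def flmp_probability(aud, vis, response):
    """P(response | auditory syllable, visual syllable) under the FLMP:
    multiply the auditory and visual support for each alternative,
    then normalize by the sum of the products across alternatives."""
    products = {
        alt: auditory_support[aud][alt] * visual_support[vis][alt]
        for alt in ALTERNATIVES
    }
    return products[response] / sum(products.values())

# Expanded factorial design: every audiovisual pairing of the test syllables.
for aud, vis in itertools.product(auditory_support, visual_support):
    probs = {alt: round(flmp_probability(aud, vis, alt), 3) for alt in ALTERNATIVES}
    print(f"A:/{aud}/ + V:/{vis}/ -> {probs}")
```

For a conflicting pairing such as auditory /ba/ with visual /va/, the normalized product yields an intermediate response distribution, which is how the model captures audiovisual integration; a breakdown of integration at large SOAs would show up as a poorer fit of this multiplicative combination to the observed identification data.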
