J Am Acad Audiol. 2020 Jan;31(1):30-39. doi: 10.3766/jaaa.18049. Epub 2019 Jun 14.

Visual Reliance During Speech Recognition in Cochlear Implant Users and Candidates

Aaron C Moberly et al. J Am Acad Audiol. 2020 Jan.

Abstract

Background: Adults with cochlear implants (CIs) are believed to rely more heavily on visual cues during speech recognition tasks than their normal-hearing peers. However, the relationship between auditory and visual reliance during audiovisual (AV) speech recognition is unclear and may depend on an individual's auditory proficiency, duration of hearing loss (HL), age, and other factors.

Purpose: The primary purpose of this study was to examine whether visual reliance during AV speech recognition depends on auditory function for adult CI candidates (CICs) and adult experienced CI users (ECIs).

Study sample: Participants included 44 ECIs and 23 CICs. All participants were postlingually deafened and had met clinical candidacy requirements for cochlear implantation.

Data collection and analysis: Participants completed City University of New York (CUNY) sentence recognition testing. Three separate lists of twelve sentences each were presented: the first in the auditory-only (A-only) condition, the second in the visual-only (V-only) condition, and the third in the combined AV condition. Each participant's "visual enhancement" (VE) and "auditory enhancement" (AE) were computed, i.e., the benefit to AV speech recognition of adding visual or auditory information, respectively, relative to what could potentially be gained. Relative reliance on visual versus auditory information was quantified as the VE/AE ratio.
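
The abstract does not state the formulas, but the description of VE and AE matches the commonly used normalized-enhancement measures, VE = (AV - A)/(100 - A) and AE = (AV - V)/(100 - V). Below is a minimal sketch under that assumption; the function names and example scores are hypothetical, not taken from the study.

    # Minimal sketch of the enhancement measures, assuming the standard
    # normalized definitions (all scores in percent correct, 0-100).
    # Function names and example scores are illustrative only.

    def visual_enhancement(av: float, a_only: float) -> float:
        """Gain from adding vision, scaled by the room left above A-only."""
        return (av - a_only) / (100.0 - a_only)  # undefined if A-only is at ceiling

    def auditory_enhancement(av: float, v_only: float) -> float:
        """Gain from adding audition, scaled by the room left above V-only."""
        return (av - v_only) / (100.0 - v_only)  # undefined if V-only is at ceiling

    def ve_ae_ratio(av: float, a_only: float, v_only: float) -> float:
        """Relative visual reliance; values > 1 mean vision contributed
        proportionally more than audition."""
        return visual_enhancement(av, a_only) / auditory_enhancement(av, v_only)

    # Hypothetical listener: 40% A-only, 20% V-only, 85% AV.
    print(visual_enhancement(85, 40))    # 0.75
    print(auditory_enhancement(85, 20))  # 0.8125
    print(ve_ae_ratio(85, 40, 20))       # ~0.92, slightly more auditory-reliant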

Results: The VE/AE ratio was inversely predicted by A-only performance; that is, poorer auditory-only scores were associated with relatively greater visual reliance. Visual reliance did not differ significantly between ECIs and CICs. Duration of HL and age did not account for additional variance in the VE/AE ratio.

Conclusions: A shift toward visual reliance may be driven by poor auditory performance in ECIs and CICs. The restoration of auditory input through a CI does not necessarily facilitate a shift back toward auditory reliance. Findings suggest that individual listeners with HL may rely on both auditory and visual information during AV speech recognition, to degrees that vary with their own performance and experience, in order to optimize communication in real-world listening situations.


Figures

Figure 1. Boxplots of ECI users' CUNY sentence recognition scores in the AV, A-only, and V-only conditions. For each condition, the horizontal line dividing the box marks the median, the upper and lower limits of the box mark the 75th and 25th percentiles, and the upper and lower whiskers mark the maximum and minimum scores.

Figure 2. Boxplots of CICs' CUNY sentence recognition scores in the AV, A-only, and V-only conditions. Boxplot conventions are as in Figure 1.

Figure 3. Scatterplot of VE versus AE scores for ECI users on CUNY sentence recognition.

Figure 4. Scatterplot of VE versus AE scores for CICs on CUNY sentence recognition.
