UCL Discovery

Eye Gaze and Perceptual Adaptation to Audiovisual Degraded Speech

Banks, B; Gowen, E; Munro, K; Adank, P; (2021) Eye Gaze and Perceptual Adaptation to Audiovisual Degraded Speech. Journal of Speech, Language, and Hearing Research, 64(9), pp. 3432-3445. 10.1044/2021_JSLHR-21-00106. (In press). Green open access

File: JSLHR-21-00106_R1.pdf - Accepted Version (PDF, 2MB)

Abstract

Purpose: Visual cues from a speaker's face may benefit perceptual adaptation to degraded speech, but current evidence is limited. We aimed to replicate results from previous studies to establish the extent to which visual speech cues can lead to greater adaptation over time, extending existing results to a real-time adaptation paradigm (i.e., without a separate training period). A second aim was to investigate whether eye gaze patterns toward the speaker's mouth were related to better perception, hypothesizing that listeners who looked more at the speaker's mouth would show greater adaptation.

Method: A group of listeners (n = 30) was presented with 90 noise-vocoded sentences in audiovisual format, whereas a control group (n = 29) was presented with the audio signal only. Recognition accuracy was measured throughout, and eye tracking was used to measure fixations toward the speaker's eyes and mouth in the audiovisual group.

Results: Previous studies were partially replicated: The audiovisual group had better recognition throughout and adapted slightly more rapidly, but both groups showed an equal amount of improvement overall. Longer fixations on the speaker's mouth in the audiovisual group were related to better overall accuracy. An exploratory analysis further demonstrated that the duration of fixations to the speaker's mouth decreased over time.

Conclusions: The results suggest that visual cues may not benefit adaptation to degraded speech as much as previously thought. Longer fixations on a speaker's mouth may play a role in successfully decoding visual speech cues; however, this will need to be confirmed in future research to fully understand how patterns of eye gaze are related to audiovisual speech recognition. All materials, data, and code are available at https://osf.io/2wqkf/.

Type: Article
Title: Eye Gaze and Perceptual Adaptation to Audiovisual Degraded Speech
Open access status: An open access version is available from UCL Discovery
DOI: 10.1044/2021_JSLHR-21-00106
Publisher version: https://doi.org/10.1044/2021_JSLHR-21-00106
Language: English
Additional information: This version is the author accepted manuscript. For information on re-use, please refer to the publisher's terms and conditions.
UCL classification: UCL
UCL > Provost and Vice Provost Offices > School of Life and Medical Sciences
UCL > Provost and Vice Provost Offices > School of Life and Medical Sciences > Faculty of Brain Sciences
UCL > Provost and Vice Provost Offices > School of Life and Medical Sciences > Faculty of Brain Sciences > Div of Psychology and Lang Sciences
UCL > Provost and Vice Provost Offices > School of Life and Medical Sciences > Faculty of Brain Sciences > Div of Psychology and Lang Sciences > Speech, Hearing and Phonetic Sciences
URI: https://discovery.ucl.ac.uk/id/eprint/10128897