Journal article in Cognition, Year: 2004

Seeing to hear better: evidence for early audio-visual interactions in speech identification

Abstract

Lip reading is the ability to partially understand speech by looking at the speaker's lips. It improves the intelligibility of speech in noise when audio-visual perception is compared with audio-only perception. A recent set of experiments showed that seeing the speaker's lips also enhances sensitivity to acoustic information, decreasing the auditory detection threshold of speech embedded in noise [J. Acoust. Soc. Am. 109 (2001) 2272; J. Acoust. Soc. Am. 108 (2000) 1197]. However, detection is different from comprehension, and it remains to be seen whether improved sensitivity also results in an intelligibility gain in audio-visual speech perception. In this work, we use an original paradigm to show that seeing the speaker's lips enables the listener to hear better and hence to understand better. The audio-visual stimuli used here could not be differentiated by lip reading per se, since they contained exactly the same lip gesture matched with different compatible speech sounds. Nevertheless, the noise-masked stimuli were more intelligible in the audio-visual condition than in the audio-only condition, due to the contribution of visual information to the extraction of acoustic cues. Replacing the lip gesture with a non-speech visual input having exactly the same time course, and hence providing the same temporal cues for extraction, removed the intelligibility benefit. This early contribution to audio-visual speech identification is discussed in relation to recent neurophysiological data on audio-visual perception.

Domains

Psychology
Main file

Schwartz_Cognition_2004.pdf (61.14 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-00186797, version 1 (12-11-2007)

Identifiers

  • HAL Id: hal-00186797, version 1

Cite

Jean-Luc Schwartz, Frédéric Berthommier, Christophe Savariaux. Seeing to hear better: evidence for early audio-visual interactions in speech identification. Cognition, 2004, 93, pp.B69-B78. ⟨hal-00186797⟩

Collections

UGA CNRS ICP