Integration of auditory, labial and manual signals in cued speech perception by deaf adults: an adaptation of the McGurk paradigm

Abstract: Among deaf individuals fitted with a cochlear implant, some use Cued Speech (CS; a system in which each syllable is uttered with a complementary manual gesture) and therefore have to combine auditory, labial and manual information to perceive speech. We examined how audio-visual (AV) speech integration is affected by the presence of manual cues, and which type of information (auditory, labial or manual) CS receivers primarily rely on depending on the ambiguity of the lip information. To address this issue, deaf CS users (N=36) and CS-naïve deaf participants (N=35) performed an identification task on two AV McGurk stimuli (one with a plosive consonant and one with a fricative). Manual cues were congruent with either the auditory information, the lip information or the expected fusion. Results revealed that deaf individuals can merge auditory and labial information into a single unified percept. Without manual cues, participants gave a high proportion of fusion responses (particularly for the ambiguous plosive McGurk stimuli). Results also suggested that manual cues can modify AV integration and that their impact differs between plosive and fricative McGurk stimuli.
Document type: Conference papers

https://hal.archives-ouvertes.fr/hal-01302422
Contributor: Clémence Bayard
Submitted on: Friday, April 7, 2017 - 11:04:04 AM
Last modification on: Thursday, March 14, 2019 - 1:19:49 AM

Identifiers

  • HAL Id: hal-01302422, version 1

Citation

Clémence Bayard, Jacqueline Leybaert, Cécile Colin. Integration of auditory, labial and manual signals in cued speech perception by deaf adults: an adaptation of the McGurk paradigm. 1st Joint Conference on Facial Analysis, Animation and Auditory-Visual Speech Processing (FAAVSP 2015), Sep 2015, Vienna, Austria. ⟨hal-01302422⟩
