Electrophysiological evidence for audio-visuo-lingual speech integration - Archive ouverte HAL
Conference paper, Year: 2015

Electrophysiological evidence for audio-visuo-lingual speech integration

Abstract

Audio-visual speech perception is a special case of multisensory processing that interfaces with the linguistic system. One important issue is whether cross-modal interactions depend only on the well-known auditory and visuo-facial modalities or might also be triggered by other sensory sources less common in speech communication. The present EEG study investigated cross-modal interactions not only between auditory, visuo-facial and audio-visuo-facial syllables, but also between auditory, visuo-lingual and audio-visuo-lingual syllables.

Eighteen adults participated in the study, none of them experienced with visuo-lingual stimuli. The stimuli were acquired with a camera and an ultrasound system, both synchronized with the acoustic signal. At the behavioral level, visuo-lingual syllables were recognized far above chance, although less accurately than visuo-labial syllables. At the brain level, audiovisual interactions were estimated by comparing the EEG responses to the multisensory stimuli (AV) with the sum of the responses to the stimuli presented in isolation (A+V). For both visuo-labial and visuo-lingual syllables, a shorter latency and a lower amplitude of the P2 auditory evoked potential were observed for AV compared to A+V. Beyond this sub-additive effect, a reduced N1 amplitude and a higher P2 amplitude were also observed for lingual compared to labial movements.

Although participants had no prior experience with visuo-lingual stimuli, our results demonstrate that they were able to recognize them, and they provide the first evidence for audio-visuo-lingual speech interactions. These results further emphasize the multimodal nature of speech perception and likely reflect the impact of listeners' knowledge of speech production.
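The additive-model comparison described above (AV vs. A+V) can be sketched on synthetic data. Everything below — signal shapes, amplitudes, latencies, trial counts — is an illustrative assumption, not the study's recordings; a real analysis would use epoched EEG data instead of simulated Gaussian peaks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated ERP epochs (trials x time samples), for illustration only.
n_trials, n_samples = 50, 300  # hypothetical: ~600 ms at 500 Hz
t = np.arange(n_samples)

def simulated_erp(latency, amplitude):
    """One ERP-like Gaussian peak plus trial-by-trial noise (assumed shape)."""
    peak = amplitude * np.exp(-0.5 * ((t - latency) / 15.0) ** 2)
    return peak + rng.normal(0.0, 0.5, (n_trials, n_samples))

# Unimodal and bimodal conditions (arbitrary amplitudes and latencies).
erp_A = simulated_erp(latency=100, amplitude=3.0)   # auditory alone
erp_V = simulated_erp(latency=90, amplitude=1.0)    # visual alone
erp_AV = simulated_erp(latency=95, amplitude=3.5)   # audiovisual

# Additive model: compare the AV average against the sum of the
# unimodal averages (A+V) at each time point.
mean_AV = erp_AV.mean(axis=0)
mean_A_plus_V = erp_A.mean(axis=0) + erp_V.mean(axis=0)

# A sub-additive interaction appears as AV < A+V around the peak,
# analogous to the reduced P2 amplitude reported in the abstract.
peak_AV = mean_AV.max()
peak_sum = mean_A_plus_V.max()
print(f"AV peak: {peak_AV:.2f}, A+V peak: {peak_sum:.2f}")
print("sub-additive" if peak_AV < peak_sum else "additive or super-additive")
```

With these made-up parameters the summed unimodal peaks exceed the AV peak, so the sketch prints "sub-additive"; the direction of the effect depends entirely on the simulated amplitudes.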
Main file: IMRF-EEG-tongue-Final.pdf (564.02 KB)
Origin: files produced by the author(s)

Dates and versions

hal-01297678, version 1 (05-04-2016)

Identifiers

  • HAL Id: hal-01297678, version 1

Cite

Coriandre Emmanuel Vilain, Avril Treille, Marc Sato. Electrophysiological evidence for audio-visuo-lingual speech integration. IMRF 2015 - 16th International Multisensory Research Forum, Jun 2015, Pisa, Italy. ⟨hal-01297678⟩
188 views
214 downloads
