Electrophysiological evidence for audio-visuo-lingual speech integration

Avril Treille 1, Coriandre Vilain 2, Jean-Luc Schwartz 1, Thomas Hueber 3, Marc Sato 4
1 GIPSA-PCMD, Département Parole et Cognition, GIPSA-lab
2 GIPSA-Services, GIPSA-lab (Grenoble Images Parole Signal Automatique)
3 GIPSA-CRISSP, Département Parole et Cognition, GIPSA-lab
Abstract: Recent neurophysiological studies demonstrate that audio-visual speech integration partly operates through temporal expectations and speech-specific predictions. Based on these results, one common view is that the binding of auditory and visual (lip-read) speech cues relies on their joint probability and on prior associative audio-visual experience. The present EEG study examined whether visual tongue movements integrate with relevant speech sounds, despite little associative audio-visual experience between the two modalities. A second objective was to determine possible similarities and differences in audio-visual speech integration between the unusual audio-visuo-lingual modality and the classical audio-visuo-labial modality. To this end, participants were presented with auditory, visual, and audio-visual isolated syllables, with the visual presentation showing either a sagittal view of the speaker's tongue movements or a facial view of the speaker's lip movements, previously recorded with an ultrasound imaging system and a video camera, respectively. In line with previous EEG studies, our results revealed an amplitude decrease and a latency facilitation of P2 auditory evoked potentials in both the audio-visuo-lingual and audio-visuo-labial conditions compared to the sum of the unimodal conditions. These results argue against the view that auditory and visual speech cues integrate solely on the basis of prior associative audio-visual perceptual experience. Rather, they suggest that dynamic and phonetic informational cues are sharable across sensory modalities, possibly through a cross-modal transfer of implicit articulatory motor knowledge.
Document type: Journal article

https://hal.archives-ouvertes.fr/hal-02074993
Contributor: Avril Treille
Submitted on: Thursday, March 21, 2019 - 10:05:22 AM
Last modification on: Friday, May 31, 2019 - 12:34:25 PM


Citation

Avril Treille, Coriandre Vilain, Jean-Luc Schwartz, Thomas Hueber, Marc Sato. Electrophysiological evidence for audio-visuo-lingual speech integration. Neuropsychologia, Elsevier, 2018, 109, pp. 126-133. ⟨10.1016/j.neuropsychologia.2017.12.024⟩. ⟨hal-02074993⟩
