The timing of visual speech modulates auditory neural processing
Journal article in Brain and Language, 2022


Abstract

In face-to-face communication, visual information from a speaker's face and the time-varying kinematics of articulatory movements have been shown to fine-tune auditory neural processing and improve speech recognition. To further determine whether the timing of visual gestures modulates auditory cortical processing, three sets of syllables differing only in the onset and duration of silent prephonatory movements preceding the acoustic speech signal were contrasted using EEG. Despite similar visual recognition rates, an increase in the amplitude of P2 auditory evoked responses was observed from the longest to the shortest movements. These results clarify how audiovisual speech perception partly operates through visually based predictions and the related processing time, with acoustic-phonetic neural processing paralleling the timing of visual prephonatory gestures.
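
To make the P2 measurement concrete, below is a minimal sketch of how such an auditory evoked response could be extracted per condition with MNE-Python. It is not the paper's actual analysis pipeline: the file name, stimulus channel, event codes, electrode (Cz), and the 150-250 ms peak window are all illustrative assumptions.

```python
# Minimal sketch (illustrative, not the paper's pipeline) of measuring a P2
# auditory evoked response per condition with MNE-Python.
import mne

# Load preprocessed EEG and extract stimulus events
# (file name and stim channel are hypothetical).
raw = mne.io.read_raw_fif("subject01_preprocessed_raw.fif", preload=True)
events = mne.find_events(raw, stim_channel="STI 014")

# Epoch around acoustic syllable onset for three prephonatory-duration
# conditions (event codes are assumptions, not taken from the paper).
event_id = {"short_gesture": 1, "medium_gesture": 2, "long_gesture": 3}
epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.2, tmax=0.5, baseline=(-0.2, 0.0), preload=True)

# Average per condition and read off the P2 peak: a positive deflection,
# roughly 150-250 ms post-onset, at an assumed fronto-central electrode.
for condition in event_id:
    evoked = epochs[condition].average().pick(["Cz"])
    ch, latency, amplitude = evoked.get_peak(tmin=0.15, tmax=0.25,
                                             mode="pos",
                                             return_amplitude=True)
    print(f"{condition}: P2 = {amplitude * 1e6:.2f} µV "
          f"at {latency * 1e3:.0f} ms")
```

Condition-wise P2 amplitudes obtained this way could then be compared statistically across the three prephonatory-movement durations, which is the contrast the abstract reports.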
Main file: Sato-INT-Manuscript-R2-notmarked.pdf (795.43 KB)
Origin: files produced by the author(s)

Dates and versions

hal-03832830, version 1 (28-10-2022)

Identifiers

Cite

Marc Sato. The timing of visual speech modulates auditory neural processing. Brain and Language, 2022, 235, p. 105196. ⟨10.1016/j.bandl.2022.105196⟩. ⟨hal-03832830⟩.