Event-Related Potentials Associated with Somatosensory Effect in Audio-Visual Speech Perception

Takayuki Ito (1, 2), Hiroki Ohashi (2), Eva Montas (2), Vincent Gracco (3, 2)
1 GIPSA-PCMD
GIPSA-DPC - Département Parole et Cognition (Speech and Cognition Department)
Abstract: Speech perception often involves multisensory processing. Although previous studies have demonstrated visual [1, 2] and somatosensory [3, 4] interactions with auditory processing, it is not clear whether somatosensory information can contribute to audiovisual speech perception. This study explored the neural consequences of somatosensory interactions in audiovisual speech processing. We assessed whether orofacial somatosensory stimulation influenced event-related potentials (ERPs) in response to an audiovisual speech illusion (the McGurk effect [1]). ERPs were recorded from 64 scalp sites in response to audiovisual speech stimulation and somatosensory stimulation. In the audiovisual condition, an auditory stimulus /ba/ was synchronized with a video of congruent facial motion (the production of /ba/) or incongruent facial motion (the production of /da/: the McGurk condition). These two audiovisual stimuli were presented randomly, with and without somatosensory stimulation associated with facial skin deformation. We found ERP differences associated with the McGurk effect in the presence of somatosensory stimulation. ERPs for the McGurk effect reliably diverged around 280 ms after auditory onset. The results demonstrate that somatosensory inputs change the cortical potentials of audiovisual processing and suggest that somatosensory information encoding facial motion also influences speech processing.
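As an illustration of the kind of ERP comparison described in the abstract, the sketch below shows how epoched EEG data can be averaged per condition and the condition difference inspected around 280 ms after auditory onset. This is a minimal, hypothetical example, not the authors' analysis pipeline: the array shapes, sampling rate, epoch window, and placeholder random data are all assumptions for demonstration only.

```python
import numpy as np

# Minimal sketch (not the authors' pipeline): form ERPs by averaging epochs
# per condition and inspect the condition difference near 280 ms post-onset.
# Sampling rate, epoch window, and data are illustrative assumptions.

fs = 500                                          # sampling rate in Hz (assumed)
n_trials, n_channels, n_samples = 100, 64, 400    # epochs spanning -200 to +598 ms
times = np.arange(n_samples) / fs - 0.2           # time axis in seconds

rng = np.random.default_rng(0)
# Placeholder epochs for two conditions (e.g., McGurk stimuli with vs. without
# somatosensory stimulation); real data would come from the EEG recording.
epochs_a = rng.standard_normal((n_trials, n_channels, n_samples))
epochs_b = rng.standard_normal((n_trials, n_channels, n_samples))

# ERP = average over trials for each condition, per channel and time point
erp_a = epochs_a.mean(axis=0)                     # shape: (n_channels, n_samples)
erp_b = epochs_b.mean(axis=0)

# Difference wave and its value at the sample closest to 280 ms after onset
diff_wave = erp_a - erp_b
idx_280 = np.argmin(np.abs(times - 0.280))
print("Condition difference at ~280 ms (per channel):", diff_wave[:, idx_280])
```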
Document type:
Conference paper
Interspeech 2017, Aug 2017, Stockholm, Sweden. pp. 669-673, 2017, <http://www.interspeech2017.org>. DOI: 10.21437/Interspeech.2017-139


https://hal.archives-ouvertes.fr/hal-01583102
Contributor: Takayuki Ito
Submitted on: Wednesday, September 6, 2017 - 17:03:54
Last modified on: Friday, September 8, 2017 - 15:31:45

File

Ito_Interspeech2017.pdf
Files produced by the author(s)

Citation

Takayuki Ito, Hiroki Ohashi, Eva Montas, Vincent Gracco. Event-Related Potentials Associated with Somatosensory Effect in Audio-Visual Speech Perception. Interspeech 2017, Aug 2017, Stockholm, Sweden. pp. 669-673, 2017, <http://www.interspeech2017.org>. DOI: 10.21437/Interspeech.2017-139. HAL: hal-01583102
