Speech in the mirror? Neurobiological correlates of self speech perception

Avril Treille 1, Coriandre Vilain 2, Sonia Kandel 3, Jean-Luc Schwartz 1, Marc Sato 4
1 GIPSA-PCMD, Département Parole et Cognition (DPC), GIPSA-lab
2 GIPSA-Services, GIPSA-lab (Grenoble Images Parole Signal Automatique)
3 GIPSA-VSLD, Département Parole et Cognition (DPC), GIPSA-lab
Abstract: Self-awareness and self-recognition during action observation may partly result from a functional matching between action and perception systems, an interaction that enhances the integration of sensory inputs with our own sensory-motor knowledge. We present combined EEG and fMRI studies examining the impact of self-knowledge on multisensory integration mechanisms during auditory, visual and audio-visual speech perception. Our hypothesis was that hearing and/or viewing oneself talk would facilitate the bimodal integration process and activate sensory-motor maps to a greater extent than observing others.

In both studies, half of the stimuli presented the participants' own productions ("self" condition) and the other half presented an unknown speaker ("other" condition). For the "self" condition, we recorded videos of each participant producing /pa/, /ta/ and /ka/ syllables; for the "other" condition, we recorded videos of a speaker the participants had never met producing the same syllables. These recordings were then presented in four modalities: auditory only (A), visual only (V), audio-visual (AV) and incongruent audio-visual (AVi, in which the audio and video components came from different speakers). In the EEG experiment, 18 participants had to categorize the syllables; in the fMRI experiment, 12 participants passively listened to and/or viewed the syllables.

In the EEG session, audiovisual interactions were estimated by comparing auditory N1/P2 ERPs in the bimodal condition (AV) with the sum of the responses in the unimodal conditions (A+V). P2 amplitude was lower for AV than for A+V. Importantly, N1 latencies were shorter in the "visual-self" than in the "visual-other" condition, regardless of signal type. In the fMRI session, the presentation modality modulated brain activation: activation was stronger for auditory or audiovisual stimuli in superior temporal auditory regions (A = AV = AVi > V), and for visual or audiovisual stimuli in MT/V5 and the premotor cortices (V = AV = AVi > A). In addition, brain activity was stronger in the "self" than in the "other" condition in both the left posterior inferior frontal gyrus (IFG) and the cerebellum (lobules I-IV).

In line with previous studies on multimodal speech perception, our results point to integration mechanisms for auditory and visual speech signals. Critically, they further demonstrate a processing advantage when the perceptual situation involves our own speech production. Hearing and/or viewing oneself talk increased activation in the left posterior IFG and the cerebellum, regions generally thought to predict the sensory outcomes of action generation. Altogether, these results suggest that viewing our own utterances temporally facilitates auditory and visual speech integration, and that processing afferent and efferent signals in sensory-motor areas contributes to self-awareness during speech perception.
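The EEG analysis rests on the additive model: if audiovisual perception were simply the sum of unimodal processes, the AV response would equal A+V, so any AV minus (A+V) difference indexes integration. Below is a minimal sketch of this comparison using MNE-Python; the epoch file name, event labels, electrode and time windows are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch of the additive-model test for audiovisual interaction
# (AV vs. A+V) on auditory N1/P2 ERPs, using MNE-Python.
import mne

# Hypothetical preprocessed epochs with event labels "A", "V", "AV"
epochs = mne.read_epochs("sub01_speech-epo.fif")
ev_A = epochs["A"].average()    # auditory-only ERP
ev_V = epochs["V"].average()    # visual-only ERP
ev_AV = epochs["AV"].average()  # bimodal ERP

# Additive model: sum of the unimodal responses
ev_sum = mne.combine_evoked([ev_A, ev_V], weights=[1, 1])

# Interaction term AV - (A + V); deviations from zero indicate
# non-additive (integrative) processing
ev_diff = mne.combine_evoked([ev_AV, ev_sum], weights=[1, -1])

# Peak N1 (~100 ms, negative) and P2 (~200 ms, positive) latency and
# amplitude at a fronto-central electrode (window choices are assumptions)
windows = {"N1": (0.08, 0.16, "neg"), "P2": (0.15, 0.30, "pos")}
for comp, (tmin, tmax, mode) in windows.items():
    for name, ev in [("A+V", ev_sum), ("AV", ev_AV)]:
        ch, lat, amp = ev.copy().pick("Cz").get_peak(
            tmin=tmin, tmax=tmax, mode=mode, return_amplitude=True)
        print(f"{comp} {name}: {amp * 1e6:+.2f} µV at {lat * 1e3:.0f} ms")
```

Under this scheme, the reported P2 amplitude reduction would appear as a smaller AV peak than the A+V peak in the second loop iteration.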
Document type:
Poster
Seventh Annual Meeting of the Society for the Neurobiology of Language, Oct 2015, Chicago, United States. 〈http://www.neurolang.org/future-meetings/〉

https://hal.archives-ouvertes.fr/hal-01297700
Contributor: Avril Treille
Submitted on: Tuesday, April 5, 2016 - 09:50:47
Last modified on: Monday, April 9, 2018 - 12:22:50
Document(s) archived on: Wednesday, July 6, 2016 - 11:50:12

File

NLC_self_EEG&IRMf_poster_FINAL...
Files produced by the author(s)

Identifiers

  • HAL Id : hal-01297700, version 1

Citation

Avril Treille, Coriandre Vilain, Sonia Kandel, Jean-Luc Schwartz, Marc Sato. Speech in the mirror? Neurobiological correlates of self speech perception. Seventh Annual Meeting of the Society for the Neurobiology of Language, Oct 2015, Chicago, United States. 〈http://www.neurolang.org/future-meetings/〉. 〈hal-01297700〉

Metrics

Record views: 457
File downloads: 62