Journal article, Frontiers in Human Neuroscience, 2019

Auditory and somatosensory interaction in speech perception in children and adults

Abstract

Multisensory integration (MSI) allows us to link sensory cues from multiple sources and plays a crucial role in speech development. However, it is not clear whether this ability is innate or whether efficient integration of sensory information in speech emerges from repeated sensory input while the brain is maturing. We investigated the integration of auditory and somatosensory information in speech processing in a bimodal perceptual task in 15 young adults (aged 19–30) and 14 children (aged 5–6). Participants were asked to identify whether the perceived target was the sound /e/ or /ø/. Half of the stimuli were presented in a unimodal condition with auditory input only. The other stimuli were presented in a bimodal condition with both auditory input and somatosensory input consisting of facial skin stretches, delivered by a robotic device, that mimic the articulation of the vowel /e/. The results indicate that the effect of somatosensory information on sound categorization was larger in adults than in children, suggesting that the integration of auditory and somatosensory information evolves over the course of development.

Dates and versions

hal-02304443, version 1 (17-11-2020)

Identifiers

HAL Id: hal-02304443
DOI: 10.3389/fnhum.2019.00344

Cite

Pamela Trudeau-Fisette, Takayuki Ito, Lucie Ménard. Auditory and somatosensory interaction in speech perception in children and adults. Frontiers in Human Neuroscience, 2019, 13, ⟨10.3389/fnhum.2019.00344⟩. ⟨hal-02304443⟩