Multisensory and sensorimotor interactions in speech perception

Kaisa Tiippana ¹, Riikka Möttönen ², Jean-Luc Schwartz ³
¹ Institute of Behavioural Sciences, University of Helsinki, Helsinki
³ GIPSA-DPC – Département Parole et Cognition (Speech and Cognition Department)
Abstract: This Research Topic presents speech as a natural, well-learned, multisensory communication signal that is processed by multiple mechanisms. Reflecting the general state of the field, most articles focus on audiovisual speech perception, and many use the McGurk effect, which arises when discrepant visual and auditory speech stimuli are presented (McGurk and MacDonald, 1976). Tiippana (2014) argues that the McGurk effect can serve as a proxy for multisensory integration, provided it is not interpreted too narrowly.

Several articles shed new light on audiovisual speech perception in special populations. Individuals with autism spectrum disorder (ASD; e.g., Saalasti et al., 2012) or language impairment (e.g., Meronen et al., 2013) are generally less influenced by the talking face than peers with typical development. Here, Stevenson et al. (2014) propose that a deficit in multisensory integration could be a marker of ASD and a component of the associated communication deficit. However, three studies suggest that integration itself is not deficient in some communication disorders. Irwin and Brancazio (2014) show that children with ASD looked less at the mouth region, resulting in poorer visual speech perception and consequently weaker visual influence. Leybaert et al. (2014) report that children with specific language impairment recognized visual and auditory speech less accurately than their controls, which affected audiovisual speech perception, while audiovisual integration per se seemed unimpaired. In a similar vein, adult patients with aphasia showed unisensory deficits but still integrated audiovisual speech information (Andersen and Starrfelt, 2015).

Multisensory information can influence response accuracy and processing speed (e.g., Molholm et al., 2002; Klucharev et al., 2003). Scarbel et al. (2014) show that oral responses to speech in noise were faster but less accurate than manual responses, suggesting that oral responses are planned at an earlier stage than manual ones. Sekiyama et al. (2014) show that older adults were more influenced by visual speech than younger adults, and relate this to their slower reaction times to auditory stimuli. Altieri and Hudock (2014) report variation in reaction-time and accuracy benefits for audiovisual speech in hearing-impaired observers, emphasizing the importance of individual differences in integration. Finally, Heald and Nusbaum (2014) show that when there were two possible talkers instead of one, audiovisual information appeared to distract observers from the word-recognition task and slowed their performance. This finding demonstrates that multisensory stimulation does not always facilitate performance.

While multisensory stimulation is thought to benefit learning (Shams and Seitz, 2008), evidence for this is still scarce, and in the current Research Topic the overall utility of multisensory learning is called into question. In a paradigm training participants to associate novel words with pictures, Bernstein et al. (2014) show no benefit of audiovisual over auditory presentation for normal-hearing individuals, and even a degradation for adults with hearing impairment. In a study of cued speech, i.e., specific hand signs for different speech sounds, Bayard et al. (2014) demonstrate that individuals with hearing impairment used the visual cues differently from their controls, even though both groups were experts in cued speech. Kelly et al. (2014)
Document type: Journal article
Frontiers in Psychology, Frontiers, 2015, doi:10.3389/fpsyg.2015.00458
Contributor: Jean-Luc Schwartz
Submitted on: Friday, October 9, 2015 - 17:03:57
Last modified on: Monday, April 9, 2018 - 12:22:49
Document(s) archived on: Sunday, January 10, 2016 - 10:34:03


Publisher files allowed on an open archive




Kaisa Tiippana, Riikka Möttönen, Jean-Luc Schwartz. Multisensory and sensorimotor interactions in speech perception. Frontiers in Psychology, Frontiers, 2015, doi:10.3389/fpsyg.2015.00458. hal-01214067
