Journal article in Language and Speech, 2009

Multimodal Indices To Japanese And French Prosodically Expressed Social Affects

Abstract

Whereas several studies have explored the expression of emotions, little is known about how the visual and audio channels are combined during the production of what we call the more controlled social affects, for example, "attitudinal" expressions. This article presents a perception study of the audiovisual expression of 12 Japanese and 6 French attitudes, in order to understand the contribution of the audio and visual modalities to affective communication. The relative importance of each modality in the perceptual decoding of the expressions of four speakers is analyzed as a first step towards a deeper comprehension of their influence on the expression of social affects. Then, the audiovisual productions of two speakers (one for each language) are analyzed acoustically (F0, duration, and intensity) and visually (in terms of Action Units), in order to relate objective parameters to listeners' perception of these social affects. The most pertinent objective features, either acoustic or visual, are then discussed from a bilingual perspective: for example, the relative influence of fundamental frequency on attitudinal expression in both languages is discussed, and the importance of a certain aspect of the voice quality dimension in Japanese is underlined.

Dates and versions

hal-00444381, version 1 (06-01-2010)

Identifiers

Cite

Albert Rilliard, Takaaki Shochi, Jean-Claude Martin, Donna Erickson, Véronique Aubergé. Multimodal Indices To Japanese And French Prosodically Expressed Social Affects. Language and Speech, 2009, 52 (2&3), pp.223-243. ⟨10.1177/0023830909103171⟩. ⟨hal-00444381⟩