Journal articles

Virtual Agent for Deaf Signing Gestures

Abstract: In this paper we describe a system for automatically synthesizing deaf signing animations from motion data captured on real deaf subjects, and we create a virtual agent endowed with expressive gestures. Our attention is focused both on the expressiveness of gesture (its qualities, such as fluidity, tension, or anger) and on its semantic representation. Our approach relies on a data-driven animation scheme. From motion data captured with an optical motion-capture system and data gloves, we extract relevant features of communicative gestures and then re-synthesize them with style variation. Within this framework, a motion database containing whole-body motion, hand motion, and facial expressions has been built. Signal analysis makes it possible to enrich this database with segmentation and annotation descriptors. Analysis and synthesis algorithms are applied to generate a set of French Sign Language gestures.

Key words: Communication for deaf people, sign language gestures, virtual signer agent, gesture database.
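To make the data organization concrete, the following is a minimal sketch of how a segmented, annotated motion database of the kind described in the abstract might be structured. All class names, fields, and the example sign labels are hypothetical illustrations, not the authors' actual schema:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of an annotated motion database entry.
# Each segment covers one sign, with per-channel capture data
# (body, hands, face) and a semantic annotation label.

@dataclass
class MotionSegment:
    sign_label: str                 # semantic annotation, e.g. the LSF sign performed
    start_frame: int                # segmentation descriptor: first frame of the sign
    end_frame: int                  # segmentation descriptor: last frame of the sign
    channels: dict = field(default_factory=dict)  # "body" / "hands" / "face" -> frame data

@dataclass
class MotionDatabase:
    segments: list = field(default_factory=list)

    def add(self, segment: MotionSegment) -> None:
        self.segments.append(segment)

    def query(self, sign_label: str) -> list:
        # Retrieve every captured instance of a sign, e.g. as input
        # to a re-synthesis step with style variation.
        return [s for s in self.segments if s.sign_label == sign_label]

db = MotionDatabase()
db.add(MotionSegment("BONJOUR", 0, 120, {"body": [], "hands": [], "face": []}))
db.add(MotionSegment("MERCI", 121, 200))
print(len(db.query("BONJOUR")))  # 1
```

In such a scheme, the segmentation and annotation descriptors mentioned in the abstract would correspond to the frame boundaries and the sign label, while the raw captured channels remain available for feature extraction.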

Cited literature: [23 references]
Contributor: Nicolas Courty
Submitted on: Tuesday, June 22, 2010 - 2:48:33 PM
Last modified on: Wednesday, November 27, 2019 - 11:24:10 AM
Archived on: Friday, September 24, 2010 - 5:45:53 PM


Files produced by the author(s)


  • HAL Id: hal-00494241, version 1


Sylvie Gibet, Alexis Héloir, Nicolas Courty, Jean-François Kamp, Philippe Gorce, et al. Virtual Agent for Deaf Signing Gestures. AMSE, Journal of the Association for the Advancement of Modelling and Simulation Techniques in Enterprises (Special edition HANDICAP), 2006, 67, pp. 127-136. ⟨hal-00494241⟩


