Gesture Recognition Based on the Fusion of Hand Positioning and Arm Gestures - Archive ouverte HAL
Journal Article in Journal of Robotics and Mechatronics, Year: 2006

Gesture Recognition Based on the Fusion of Hand Positioning and Arm Gestures

Abstract

To improve the link between operators and equipment, communication systems have begun using natural (user-oriented) modalities such as speech and gestures. Our goal is to present gesture recognition based on the fusion of measurements from different sources. The sensors must capture at least the location and orientation of the hand, as is done by a Dataglove and a video camera. The Dataglove gives the hand position, and the video camera gives the general arm gesture, representing the gesture's physical and spatial properties through a two-dimensional (2D) skeleton representation of the arm. The measurements are partly complementary and partly redundant. The application is distributed over intelligent co-operating sensors. We detail the measurement of hand positioning and arm gestures, the fusion processes, and the implementation.
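The abstract's notion of partly complementary, partly redundant measurements can be illustrated with a simple score-level fusion sketch. This is not the fusion process from the paper, only a minimal illustration under assumed conventions: each sensor reports per-gesture confidence scores in [0, 1], gestures seen by both sensors (redundant case) are averaged to damp single-sensor noise, and gestures seen by only one sensor (complementary case) keep that sensor's weighted score. The function names, weights, and gesture labels are all hypothetical.

```python
def fuse_scores(hand_scores, arm_scores, w_hand=0.5, w_arm=0.5):
    """Combine per-gesture confidence scores from two sensors.

    hand_scores, arm_scores: dicts mapping gesture label -> confidence in [0, 1].
    Labels present in both dicts are fused by a weighted average (redundant
    measurements); labels present in only one keep that sensor's weighted
    contribution (complementary measurements).
    """
    labels = set(hand_scores) | set(arm_scores)
    fused = {}
    for label in labels:
        h = hand_scores.get(label, 0.0)  # 0.0 when the hand sensor is silent
        a = arm_scores.get(label, 0.0)   # 0.0 when the arm sensor is silent
        fused[label] = w_hand * h + w_arm * a
    return fused


def recognize(hand_scores, arm_scores):
    """Return the gesture label with the highest fused confidence."""
    fused = fuse_scores(hand_scores, arm_scores)
    return max(fused, key=fused.get)
```

For example, if the Dataglove strongly suggests "point" and the camera weakly agrees, the fused decision is still "point" even though the camera alone also saw a plausible "wave".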
No file deposited

Dates and versions

hal-00428594 , version 1 (29-10-2009)

Identifiers

  • HAL Id : hal-00428594 , version 1

Cite

Didier Coquin, Eric Benoit, Hideyuki Sawada, Bogdan Ionescu. Gesture Recognition Based on the Fusion of Hand Positioning and Arm Gestures. Journal of Robotics and Mechatronics, 2006, 18 (6), pp.751-759. ⟨hal-00428594⟩
