Conference paper, Year: 2013

Controlling a Mobile Robot with Natural Commands based on Voice and Gesture

Abstract

This paper presents a real-time system for controlling a small mobile robot using combined audio (speech) and video (gesture) commands. Commercial hardware is used, based on open-source code. Gestures are recognised with a dynamic time warping (DTW) algorithm applied to skeleton points derived from the RGB-D camera of the Kinect sensor. We present the integration of a faster, parallel version of the DTW algorithm. Speech is recognised using a reduced-vocabulary HMM toolkit. Audio beamforming is exploited to localise the person relative to the robot. The separate commands are passed to a fusion centre, which resolves conflicting and complementary instructions. As a result, complex commands such as "go there" and "come here" may be recognised without a complex scene model. We provide a comprehensive analysis of performance in an indoor, reverberant environment.
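As a rough illustration of the gesture-matching step described above, the sketch below shows a minimal serial DTW over skeleton-joint trajectories. The frame layout (frames x joints x 3), the Euclidean frame distance, and the template dictionary are illustrative assumptions, not the authors' implementation.

```python
# Minimal DTW sketch for matching an observed gesture against stored
# templates, each a sequence of skeleton frames (frames x joints x 3).
# Illustrative only; the paper uses a parallelised DTW variant.
import numpy as np


def frame_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Euclidean distance between two skeleton frames (joints x 3)."""
    return float(np.linalg.norm(a - b))


def dtw_cost(seq_a: np.ndarray, seq_b: np.ndarray) -> float:
    """Cumulative DTW alignment cost between two gesture sequences."""
    n, m = len(seq_a), len(seq_b)
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = frame_distance(seq_a[i - 1], seq_b[j - 1])
            acc[i, j] = d + min(acc[i - 1, j],      # skip a frame in seq_a
                                acc[i, j - 1],      # skip a frame in seq_b
                                acc[i - 1, j - 1])  # align both frames
    return float(acc[n, m])


def classify_gesture(observation: np.ndarray, templates: dict) -> str:
    """Return the template name with the lowest DTW cost (hypothetical API)."""
    return min(templates, key=lambda name: dtw_cost(observation, templates[name]))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    templates = {"wave": rng.normal(size=(30, 20, 3)),
                 "point": rng.normal(size=(25, 20, 3))}
    obs = templates["wave"] + 0.05 * rng.normal(size=(30, 20, 3))
    print(classify_gesture(obs, templates))  # expected: "wave"
```

The faster parallel DTW mentioned in the abstract would typically distribute the work across templates or along the anti-diagonals of the cost matrix; the serial version above is kept deliberately simple for clarity.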
Main file: RobotCommandingByVoiceAndGesture.pdf (1.58 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-02996999, version 1 (09-11-2020)

Identifiers

  • HAL Id: hal-02996999, version 1

Cite

A. R. Fardana, S. Jain, Igor Jovančević, Y. Suri, C. Morand, et al. Controlling a Mobile Robot with Natural Commands based on Voice and Gesture. IEEE International Conference on Robotics and Automation (ICRA) - Workshop on Human Robot Interaction for Assistance and Industrial Robots, May 2013, Karlsruhe, Germany. ⟨hal-02996999⟩
60 views
59 downloads
