Recognition of Technical Gestures for Human-Robot Collaboration in Factories

Abstract—Enabling smooth human-robot collaboration requires enhancing the perception and intelligence of robots, so that they can "understand" the actions performed by the humans with whom they interact. In this paper we deal with new industrial collaborative robots on the assembly line and supply chain in automotive manufacturing. We conduct research on technical gesture recognition, to allow the robot to understand which task is being executed by the human worker and to react accordingly. We use two kinds of sensors: a depth camera for monitoring human movements, and inertial sensors placed on tools. In this study, we propose and use a method for head and hand tracking from a top-view depth map, and use HMMs (Hidden Markov Models) to recognize gestures from these data. We then refine the HMM results with data from the inertial sensors equipping the tools. Our research shows that: i) using 3D vision only, we already obtain good gesture-recognition results for several workers: 80% of the gestures are correctly recognized; ii) exploiting data from tools equipped with inertial sensors significantly improves the recognition accuracy, to 94% in the same multi-user evaluation. A first test of our method on a simple human-robot collaboration scenario is also described.

I. INTRODUCTION

Robots are becoming more and more present in our everyday life. They can be used for social interaction or for medical support. In the industrial context, collaborative robots are emerging that are intrinsically "safe". These robots, devoted to tasks that are either of low added value or a potential source of musculoskeletal disorders, work near workers without barriers between them, contrary to current robots in factories. Collaborative robots therefore allow increased automation of factories, saving space and cost while improving productivity in industrial plants.
This new configuration of collaboration between robots and humans on the assembly line and supply chain is efficient only if the human-robot collaboration is smooth, i.e., the robot follows the human's gestures and responds fluidly. To this end, and in order to ensure workers' safety, a collaborative robot has to be aware of its environment, be able to adapt its speed to the worker's pace, and monitor the worker's actions in order to ensure smooth cooperation. Gesture recognition can meet these needs: by recognizing the worker's gestures, the robot can determine which task is being executed, adapt its speed, and detect when something unexpected happens. One of the difficulties of this goal is that, contrary to most human-computer interactions where the user can adapt to the system, the worker must be able to work "as usual" and is not supposed to make any effort for their gestures to be correctly understood by the robot.
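The recognition scheme outlined in the abstract — one HMM per gesture class, with an observed sequence assigned to the model of highest likelihood — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the two toy gesture models, their discrete observation symbols, and all parameter values are assumptions chosen purely for the example.

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the scaled forward algorithm.
    pi: initial state probabilities, A: transition matrix, B: emission matrix."""
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    logp = np.log(c)
    alpha /= c
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()
        logp += np.log(c)
        alpha /= c
    return logp

# Two hypothetical 2-state gesture HMMs over 3 quantized observation symbols;
# in the paper the observations would come from head/hand tracking instead.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
models = {
    "pick_part": (pi, A, np.array([[0.7, 0.2, 0.1],
                                   [0.6, 0.3, 0.1]])),
    "screwing":  (pi, A, np.array([[0.1, 0.2, 0.7],
                                   [0.1, 0.3, 0.6]])),
}

def classify(obs):
    # assign the sequence to the gesture model with the highest likelihood
    return max(models, key=lambda g: forward_loglik(obs, *models[g]))

print(classify([2, 2, 1, 2, 2]))  # prints "screwing"
```

In the paper, gesture models are trained from tracked head and hand positions (and later refined with inertial-sensor data); here the parameters are fixed by hand purely to make the classification-by-likelihood step concrete.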

Contributor: Eva Coupeté
Submitted on: Sunday, April 24, 2016 - 3:45:39 PM
Last modification on: Monday, November 12, 2018 - 10:56:42 AM
Document(s) archived on: Monday, July 25, 2016 - 10:28:54 AM




  • HAL Id: hal-01306482, version 1


Eva Coupeté, Fabien Moutarde, Sotiris Manitsaris, Olivier Hugues. Recognition of Technical Gestures for Human-Robot Collaboration in Factories. The Ninth International Conference on Advances in Computer-Human Interactions, Apr 2016, Venice, Italy. ⟨hal-01306482⟩


