
BLSTM-RNN based 3D Gesture Classification

Abstract: This paper presents a new robust method for inertial MEM (MicroElectroMechanical systems) 3D gesture recognition. The linear acceleration and the angular velocity, respectively provided by the accelerometer and the gyrometer, are sampled in time, resulting in 6D values at each time step which are used as inputs for the gesture recognition system. We propose to build a system based on Bidirectional Long Short-Term Memory Recurrent Neural Networks (BLSTM-RNN) for gesture classification from raw MEM data. We also compare this system to a geometric approach using DTW (Dynamic Time Warping) and a statistical method based on HMM (Hidden Markov Model) from filtered and denoised MEM data. Experimental results on 22 individuals producing 14 gestures in the air show that the proposed approach outperforms classical classification methods with a mean classification rate of 95.57% and a standard deviation of 0.50 for 616 test gestures. Furthermore, these experiments underline that combining accelerometer and gyrometer information gives better results than using a single inertial description.
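
For readers who want a concrete starting point, the sketch below shows what a bidirectional LSTM classifier over 6D inertial sequences can look like in PyTorch. It is not the authors' implementation: the hidden size, the mean-over-time pooling, and the random example batch are illustrative assumptions; only the 6D input per time step (3-axis accelerometer plus 3-axis gyrometer) and the 14 gesture classes come from the abstract above.

# Minimal sketch (not the authors' code): a bidirectional LSTM classifier
# for raw 6D inertial sequences, as a rough analogue of the BLSTM-RNN
# approach described in the abstract. Hyperparameters are illustrative.
import torch
import torch.nn as nn

class BLSTMGestureClassifier(nn.Module):
    def __init__(self, input_size=6, hidden_size=64, num_classes=14):
        super().__init__()
        # Bidirectional LSTM reads the gesture sequence forwards and backwards.
        self.blstm = nn.LSTM(input_size, hidden_size,
                             batch_first=True, bidirectional=True)
        # Concatenated forward/backward states feed a linear classifier.
        self.fc = nn.Linear(2 * hidden_size, num_classes)

    def forward(self, x):
        # x: (batch, time, 6) raw accelerometer + gyrometer samples
        outputs, _ = self.blstm(x)
        # Pool over time (an assumption; the paper may combine outputs differently).
        pooled = outputs.mean(dim=1)
        return self.fc(pooled)

# Example: a batch of 8 gestures, each 100 time steps of 6D MEM data.
model = BLSTMGestureClassifier()
logits = model(torch.randn(8, 100, 6))   # -> shape (8, 14)
pred = logits.argmax(dim=1)              # predicted gesture class per sample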

Cited literature: 13 references

https://hal.archives-ouvertes.fr/hal-01224806
Contributor: Grégoire Lefebvre
Submitted on: Thursday, November 5, 2015 - 10:47:14 AM
Last modification on: Monday, February 15, 2021 - 9:34:02 AM
Long-term archiving on: Saturday, February 6, 2016 - 11:14:41 AM

File

ICANN2013.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-01224806, version 1

Citation

Grégoire Lefebvre, Samuel Berlemont, Franck Mamalet, Christophe Garcia. BLSTM-RNN based 3D Gesture Classification. Artificial Neural Networks and Machine Learning, ICANN 2013, 23rd International Conference on Artificial Neural Networks, Sep 2013, Sofia, Bulgaria. ⟨hal-01224806⟩


Metrics

Record views: 673
File downloads: 1815