Conference papers

3D-Posture Recognition Using Joint Angle Representation

Abstract: This paper presents an approach for recognizing actions performed by humans using joint angles derived from skeleton information. Unlike classical approaches that focus on the body silhouette, our approach uses body joint angles estimated directly from time-series skeleton sequences captured by a depth sensor. The 3D joint locations of the skeletal data are first processed, and the 3D locations computed from the action sequences are described as angle features. To generate prototypes of action poses, the joint features are quantized into posture visual words. The temporal transitions of the visual words are encoded as symbols for a Hidden Markov Model (HMM). Each action is trained as an HMM over the visual-word symbols, and all trained HMMs are then used for action recognition.
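The pipeline described in the abstract can be sketched in three steps: computing a joint angle from 3D skeleton positions, quantizing per-frame angle features into posture visual words, and scoring a visual-word sequence under a discrete-output HMM with the forward algorithm. The helper names (`joint_angle`, `quantize`, `hmm_log_likelihood`) and the nearest-centroid quantizer are illustrative assumptions, not the paper's exact method; a minimal sketch:

```python
import numpy as np
from scipy.special import logsumexp

def joint_angle(a, b, c):
    """Angle (radians) at joint b between segments b->a and b->c.
    a, b, c are 3D joint positions from the skeleton."""
    u, v = a - b, c - b
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def quantize(features, centroids):
    """Map each frame's angle-feature vector to the index of the nearest
    centroid, i.e. its posture 'visual word' (centroids would come from
    clustering the training frames, e.g. with k-means)."""
    d = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

def hmm_log_likelihood(obs, log_pi, log_A, log_B):
    """Forward algorithm: log P(obs | HMM) for a discrete-output HMM.
    log_pi: initial state log-probs, log_A[i, j]: transition log-probs,
    log_B[i, k]: log-prob of emitting symbol k in state i."""
    alpha = log_pi + log_B[:, obs[0]]
    for o in obs[1:]:
        alpha = logsumexp(alpha[:, None] + log_A, axis=0) + log_B[:, o]
    return logsumexp(alpha)
```

At recognition time, one HMM per action class would score the visual-word sequence of a test clip, and the class with the highest log-likelihood wins.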

Cited literature: 25 references
Contributor: Youssef Chahir
Submitted on: Tuesday, September 18, 2018 - 10:53:06 AM
Last modification on: Saturday, June 25, 2022 - 9:52:36 AM
Long-term archiving on: Wednesday, December 19, 2018 - 1:40:03 PM




HAL Id: hal-01168442, version 1


Adnan Al Alwani, Youssef Chahir, Djamal Goumidi, Michèle Molina, François Jouen. 3D-Posture Recognition Using Joint Angle Representation. 15th International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems (IPMU), Jul 2014, Montpellier, France. ⟨hal-01168442⟩


