Conference papers

3D-Posture Recognition Using Joint Angle Representation

Abstract: This paper presents an approach for recognizing human actions using joint angles derived from skeleton information. Unlike classical approaches that focus on the body silhouette, our approach uses body joint angles estimated directly from time-series skeleton sequences captured by a depth sensor. The 3D joint locations of the skeletal data are processed first, and the locations computed over each action sequence are then described as angle features. To generate prototypes of action poses, these joint features are quantized into posture visual words. The temporal transitions between visual words are encoded as symbols for a Hidden Markov Model (HMM). One HMM is trained per action using the visual-word symbols, and the trained HMMs are then used for action recognition.
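The sketch below illustrates the pipeline the abstract describes: per-frame joint angles, quantization into posture visual words, and per-action discrete HMM scoring. It is a minimal reading of the paper, not the authors' implementation: the bone pairs, the use of k-means for quantization, the vocabulary size, and all function names are assumptions made for illustration.

```python
# Minimal sketch of the abstract's pipeline (assumed details, not the authors' code):
# joint angles per frame -> k-means posture "visual words" -> per-action discrete HMMs.
import numpy as np
from sklearn.cluster import KMeans

def joint_angles(skeleton, bone_pairs):
    """Turn one frame of 3D joint positions into angles between bone pairs.

    skeleton   : (n_joints, 3) array of 3D joint locations
    bone_pairs : list of ((a, b), (c, d)) joint-index tuples; the angle is
                 measured between vectors b - a and d - c.
    """
    angles = []
    for (a, b), (c, d) in bone_pairs:
        u = skeleton[b] - skeleton[a]
        v = skeleton[d] - skeleton[c]
        cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8)
        angles.append(np.arccos(np.clip(cos, -1.0, 1.0)))
    return np.array(angles)

def quantize(angle_frames, n_words=32):
    """Cluster per-frame angle vectors into posture visual words (one symbol per frame)."""
    km = KMeans(n_clusters=n_words, n_init=10).fit(angle_frames)
    return km, km.labels_

def log_forward(symbols, log_pi, log_A, log_B):
    """Log-likelihood of a symbol sequence under a discrete HMM (forward algorithm)."""
    alpha = log_pi + log_B[:, symbols[0]]
    for s in symbols[1:]:
        alpha = np.logaddexp.reduce(alpha[:, None] + log_A, axis=0) + log_B[:, s]
    return np.logaddexp.reduce(alpha)

def classify(symbols, action_hmms):
    """Pick the action whose trained HMM gives the highest likelihood for the sequence."""
    scores = {name: log_forward(symbols, *params) for name, params in action_hmms.items()}
    return max(scores, key=scores.get)
```

In practice the per-action HMM parameters (`log_pi`, `log_A`, `log_B`) would be estimated from training sequences, e.g. with Baum-Welch; the classifier then simply compares the forward log-likelihoods across the trained models, as the abstract's last sentence suggests.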

Cited literature: 25 references

https://hal.archives-ouvertes.fr/hal-01168442
Contributor: Youssef Chahir
Submitted on: Tuesday, September 18, 2018 - 10:53:06 AM
Last modification on: Tuesday, May 5, 2020 - 11:50:16 AM
Document(s) archived on: Wednesday, December 19, 2018 - 1:40:03 PM

File

ipmu14action-final.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-01168442, version 1

Citation

Adnan Al Alwani, Youssef Chahir, Djamal Goumidi, Michèle Molina, François Jouen. 3D-Posture Recognition Using Joint Angle Representation. 15th International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems (IPMU), Jul 2014, Montpellier, France. ⟨hal-01168442⟩
