
Online Human Activity Recognition for Ergonomics Assessment

Adrien Malaisé ¹, Pauline Maurice ¹, Francis Colas ¹, Serena Ivaldi ¹
¹ LARSEN - Lifelong Autonomy and interaction skills for Robots in a Sensing ENvironment,
Inria Nancy - Grand Est, LORIA - AIS - Department of Complex Systems, Artificial Intelligence & Robotics
Abstract: We address the problem of recognizing the current activity performed by a human worker, providing information useful for the automatic ergonomic evaluation of workstations in industrial applications. Traditional ergonomic assessment methods rely on pen-and-paper worksheets, such as the Ergonomic Assessment Worksheet (EAWS). To date, no tool exists to automatically estimate an ergonomics score from sensors (external cameras or wearable sensors). Since the ergonomic evaluation depends on the activity being performed, the first step towards a fully automatic ergonomic assessment is to automatically identify the different activities within an industrial task. To address this problem, we propose a method based on wearable sensors and supervised learning with a Hidden Markov Model (HMM). The activity recognition module works in two steps: first, the parameters of the model are learned offline from labeled observations from both sensor systems; then, the trained model is used to recognize activities, either offline or online.

We apply our method to recognize the current activity of a worker during a series of tasks typical of the manufacturing industry. We recorded 6 participants performing a sequence of tasks using wearable sensors. Two systems were used: the MVN Link suit from Xsens and the e-glove from Emphasis Telematics (see Fig. 1). The first consists of 17 wireless inertial sensors embedded in a lycra suit and is used to track the whole-body motion. The second is a glove that includes pressure sensors on the fingertips as well as finger flexion sensors. The motion capture data are combined with the glove data and fed to our activity recognition model. The tasks were designed to involve elements of the EAWS such as load handling, screwing, and manipulating objects in different static postures. The data are labeled following EAWS categories such as "standing bent forward", "overhead work" or "kneeling".
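The offline learning step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the per-frame feature layout, and the diagonal-Gaussian emission model are all assumptions. With labeled training frames, supervised HMM parameter estimation reduces to counting label transitions and computing per-class statistics:

```python
import numpy as np

def fit_supervised_hmm(features, labels, n_states):
    """Hypothetical sketch: estimate HMM parameters from labeled frames.

    features: (T, D) array of per-frame feature vectors
    labels:   (T,) array of integer activity labels in [0, n_states)
    Returns the initial distribution, transition matrix, and per-state
    diagonal-Gaussian emission parameters (mean, variance).
    """
    T, _ = features.shape
    # Transition matrix: count label-to-label transitions, with
    # add-one smoothing so unseen transitions keep nonzero probability.
    A = np.ones((n_states, n_states))
    for t in range(T - 1):
        A[labels[t], labels[t + 1]] += 1
    A /= A.sum(axis=1, keepdims=True)
    # Initial distribution: empirical frequency of each activity.
    pi = np.bincount(labels, minlength=n_states).astype(float)
    pi /= pi.sum()
    # Emissions: per-state feature mean and variance (small floor on
    # the variance avoids degenerate zero-variance states).
    means = np.array([features[labels == s].mean(axis=0)
                      for s in range(n_states)])
    vars_ = np.array([features[labels == s].var(axis=0) + 1e-6
                      for s in range(n_states)])
    return pi, A, means, vars_
```

The smoothing and variance floor are standard precautions when some activities are rare in the training data, as can happen with short task recordings.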
In terms of performance, the model recognizes the EAWS-related activities with 91% precision using a small subset of features: the vertical position of the center of mass, the velocity of the center of mass, and the angle of the L5S1 joint.
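Once the HMM parameters are learned, online recognition can be performed with recursive forward filtering over incoming frames. The sketch below is a hypothetical illustration assuming per-state diagonal-Gaussian emissions, not the paper's code; each frame would be a small feature vector such as the CoM height, CoM velocity, and L5S1 angle mentioned above:

```python
import numpy as np

def gaussian_loglik(x, means, vars_):
    # Log-likelihood of one observation x under each state's
    # diagonal-Gaussian emission model; returns one value per state.
    return -0.5 * (np.log(2 * np.pi * vars_)
                   + (x - means) ** 2 / vars_).sum(axis=1)

def online_filter(pi, A, means, vars_, stream):
    """Yield the most likely current activity for each incoming frame."""
    belief = np.log(pi)
    for x in stream:
        # Predict: propagate the belief through the transition matrix,
        # shifting by the max log-probability for numerical stability.
        m = belief.max()
        belief = np.log(A.T @ np.exp(belief - m)) + m
        # Correct: weight each state by the observation likelihood.
        belief += gaussian_loglik(x, means, vars_)
        belief -= belief.max()
        yield int(np.argmax(belief))
```

Because the filter only uses past and current frames, it can run in real time on the sensor stream, which is what distinguishes the online recognition mode from offline decoding over a complete recording.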
Document type: Conference papers
Contributor: Adrien Malaisé
Submitted on: Monday, October 8, 2018 - 3:17:18 PM
Last modification on: Thursday, January 20, 2022 - 5:26:29 PM
Long-term archiving on: Wednesday, January 9, 2019 - 12:48:26 PM




HAL Id: hal-01808832, version 1



Adrien Malaisé, Pauline Maurice, Francis Colas, Serena Ivaldi. Online Human Activity Recognition for Ergonomics Assessment. SIAS 2018 - 9th International Conference on the Safety of Industrial Automated Systems, Oct 2018, Nancy, France. ⟨hal-01808832⟩


