POLIMOD Pipeline: documentation. Motion Capture, Visualization & Data Analysis for gesture studies (Archive ouverte HAL)
Report (Research Report), Year: 2018


Dominique Boutet
Jean-François Jégo
Vincent Meyrueis

Abstract

We propose a pipeline to collect, visualize, annotate and analyze motion-capture (mocap) data for gesture studies. A pipeline is "an implementation of a workflow specification. The term comes from computing, where it means a set of serial processes, with the output of one process being the input of the subsequent process. A production pipeline is not generally perfectly serial because real workflows usually have branches and iterative loops, but the idea is valid: A pipeline is the set of procedures that need to be taken in order to create and hand off deliverables" (Okun, 2010). The pipeline designed here (table 1) comprises two main parts and three subparts. The first part describes the data-collection process: the setup and its prerequisites, the protocol to follow, and how to export data for the intended analysis. The second part focuses on data analysis, describing the main steps of data processing, data analysis proper according to different gesture descriptors, and data visualization for understanding complex or multidimensional gesture features. We design the pipeline as blocks connected by arrows: each block represents a specific step carried out with hardware or software, the arrows represent the flow of data between blocks, and the attached three-letter acronyms refer to the data file formats (table 2). The development of the pipeline raises three main questions: how to synchronize data? How to select and transform it? And what does mocap change in annotation? To address data synchronization, we design a protocol that details how hardware should be selected according to the type of measures, and that requires specific steps from the participant, such as adopting a T-pose and clapping the hands once at the beginning and at the end of the recording, to facilitate data synchronization and export in the subsequent steps.
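The clap described above produces a sharp spike in hand motion that can anchor synchronization across recordings. As a minimal illustrative sketch (not part of the POLIMOD tooling; function names and the threshold are assumptions), the clap frame in each stream could be located by thresholding frame-to-frame hand speed, and the offset between two devices derived from the two clap frames:

```python
import numpy as np

def find_clap_frame(positions, fps, threshold_factor=5.0):
    """Locate a clap event in a 1-D hand-position signal (metres) by
    finding the first frame whose frame-to-frame speed exceeds
    threshold_factor times the median speed. Illustrative only."""
    speed = np.abs(np.diff(positions)) * fps       # speed per frame, m/s
    baseline = np.median(speed) + 1e-9             # avoid a zero baseline
    spikes = np.nonzero(speed > threshold_factor * baseline)[0]
    return int(spikes[0]) if spikes.size else None

def align_offset(stream_a, stream_b, fps):
    """Frame offset between two recordings of the same clap."""
    a = find_clap_frame(stream_a, fps)
    b = find_clap_frame(stream_b, fps)
    if a is None or b is None:
        return None
    return a - b
```

In practice the two streams would be, for example, a wrist-marker coordinate from the mocap system and a hand position estimated from the video; shifting one stream by the returned offset aligns the two recordings.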
Regarding the selection and transformation of relevant data, we propose to select and prepare files for export according to the intended analysis and software. Mocap files can thus be converted to videos for visualization, for instance in Elan to enhance gesture coding; converted to text files for analysis in Excel; or processed in Unity to explore the flow of movement, new gesture features, or kinematics. We detail all these processes in an open-access, step-by-step tutorial. Finally, we ask what a pipeline involving mocap changes in annotation. We observe that mocap no longer imposes a single point of view but allows as many as required, since virtual cameras can be used to study gesture from the "skeleton" of the participant. For example, we show that it is possible to adopt a first-person point of view to embody, and thus better understand, participants' gestures. We also propose an augmented-reality tool developed in the Unity3D software to visualize multidimensional gesture features in real time (such as velocity, acceleration, and jerk), or a combination of them, as simpler curve or surface representations. As a future direction, the data collected here could feed a machine-learning algorithm to automatically extract gesture properties or to detect and tag the aspectuality of gestures. Lastly, an embodied visualization tool using virtual reality could offer new possibilities for coding and understanding gestures beyond using a 2D video as a reference or study material.
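The kinematic features mentioned above (velocity, acceleration, jerk) can be derived from an exported marker trajectory by successive finite differences. The sketch below assumes a NumPy array of 3-D positions sampled at a fixed frame rate; it is a generic illustration, not the authors' implementation:

```python
import numpy as np

def kinematic_descriptors(positions, fps):
    """Velocity, acceleration and jerk magnitudes for one marker.

    positions: array of shape (n_frames, 3), marker coordinates in metres.
    fps: frame rate of the mocap recording.
    Returns three (n_frames,) arrays of magnitudes, computed with
    successive finite differences (np.gradient along the time axis)."""
    dt = 1.0 / fps
    vel = np.gradient(positions, dt, axis=0)    # m/s
    acc = np.gradient(vel, dt, axis=0)          # m/s^2
    jerk = np.gradient(acc, dt, axis=0)         # m/s^3
    magnitude = lambda v: np.linalg.norm(v, axis=1)
    return magnitude(vel), magnitude(acc), magnitude(jerk)
```

These per-frame magnitudes are the kind of one-dimensional series that can be exported as text for Excel, or mapped onto curve and surface representations in a real-time Unity3D visualization.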
Main file: POLIMOD 2017-18 - PIPELINE Documentation.pdf (3.02 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-01950466 , version 1 (10-12-2018)

Identifiers

  • HAL Id : hal-01950466 , version 1

Cite

Dominique Boutet, Jean-François Jégo, Vincent Meyrueis. POLIMOD Pipeline: documentation. Motion Capture, Visualization & Data Analysis for gesture studies. [Research Report] Université de Rouen, Université Paris 8, Moscow State Linguistic University. 2018. ⟨hal-01950466⟩
127 views
127 downloads
