Conference poster, 2019

The SignAge Corpus: Recording older signers with low cost motion capture devices

Abstract

For almost ten years, the marketing of low-cost motion capture devices such as the Microsoft Kinect sensor has enabled numerous studies in real-life settings (Mousavi Hondori & Khademi, 2014; Webster & Celik, 2014; Springer & Yogev Seligmann, 2016). Whereas most of this work with elderly people studies gait and fall risks (see, for example, Rougier, Auvinet, Rousseau, Mignotte, & Meunier, 2011), in the present paper we focus on the building of the SignAge corpus, dedicated to the study of signing in elderly deaf participants with low-cost motion capture devices. Up to now, a (preferably multi-)camera setup has been considered a basic requirement in sign language studies, sometimes complemented with much more intrusive or expensive equipment such as data gloves or optical motion capture systems (Channon, 2015, p. 132–133). But the latest technological advances allow us to quantify 3D motions and their time derivatives at a reasonable price. Our newly built SignAge corpus of interactions between elderly deaf signers in LSF takes advantage of these advances. The SignAge corpus combines data acquired by:

- 2 digital video cameras, with one shot framing the signing interviewee's upper body and one shot framing the whole interaction (similarly to the protocol developed in CorpAGEst, Bolly & Boutet, 2016);
- 2 Noitom Perception Neuron body straps, each equipped with 25 IMUs (Inertial Measurement Units), recorded with the Axis Neuron software;
- 1 Kinect for Windows v2 (also known as Kinect for Xbox One) depth sensor centered on the interviewee, recorded with Brekel Pro Body v2.

Thus, for each participant, our 5 timed data flows are: 2 video streams at 25 fps, synchronised, visualised and annotated in the ELAN software (Sloetjes & Seibert, 2016), and 3 BioVision Hierarchy (BVH) files, one per Perception Neuron strap and one from the Kinect (see Meredith & Maddock, 2001). These BVH files are visualised in Motion Inspector, and the following descriptors are computed in Matlab for the three age groups in our sample: the global quantity of motion (Sarasúa & Guaus, 2014) and the variation of the signing amplitude for each joint of the upper limbs, in an attempt to establish a correlation between age and the articulatory segment involved.

For now, after manually post-synchronising our data flows thanks to start and end claps performed by both the interviewer and the interviewee during the recording session, we are assessing the quality of our data. Preliminary results show that the temporal resolution of the Kinect used in conjunction with Brekel can be questionable, whereas spatial drift is a problem for the Perception Neuron. Hence, these two devices seem to be complementary: the Kinect captures the absolute movements of the body, whereas the Perception Neuron is much more precise for relative movements. Furthermore, post-interview feedback from each subject showed a high acceptance of the protocol by elderly signers (see figure below). In conclusion, given these technical limitations, the Kinect and the Perception Neuron still appear to be an interesting choice for obtaining usable additional 3D data in aging studies, because of their portability and the ease with which participants get accustomed to wearing the body straps.
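To make the descriptors mentioned above more concrete, here is a minimal sketch of how a global quantity of motion and a per-joint signing amplitude could be computed from joint trajectories extracted from the BVH files. It is written in Python with NumPy rather than the authors' Matlab; the BVH parsing step is assumed to have been done elsewhere, and the function names, frame rate and synthetic data are illustrative only, not the actual SignAge pipeline.

```python
import numpy as np

def quantity_of_motion(positions, fps):
    """Global quantity of motion over a recording.

    positions: array of shape (n_frames, n_joints, 3) holding 3D joint
    positions already extracted from a BVH file (parsing not shown).
    Returns the average, over frames, of the summed frame-to-frame
    displacement of all joints, scaled by fps to give a rate per second.
    """
    # Euclidean displacement of each joint between consecutive frames
    disp = np.linalg.norm(np.diff(positions, axis=0), axis=2)
    # Sum over joints, average over time, convert to displacement per second
    return disp.sum(axis=1).mean() * fps

def signing_amplitude(positions, joint_index):
    """Range of motion (max minus min) of one joint along each axis (x, y, z)."""
    joint = positions[:, joint_index, :]
    return joint.max(axis=0) - joint.min(axis=0)

if __name__ == "__main__":
    # Synthetic random-walk data standing in for 10 upper-limb joints
    rng = np.random.default_rng(seed=0)
    fake_positions = rng.normal(scale=0.001, size=(600, 10, 3)).cumsum(axis=0)
    print("Quantity of motion:", quantity_of_motion(fake_positions, fps=60))
    print("Amplitude of joint 3:", signing_amplitude(fake_positions, 3))
```

Summaries of this kind, computed per joint of the upper limbs, could then be compared across the three age groups in the sample when looking for a correlation between age and the articulatory segment involved, as the abstract describes.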
Main file
Poster2CLARe_V3_web.pdf (500.53 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-02065452, version 1 (13-03-2019)

Identifiers

  • HAL Id: hal-02065452, version 1

Cite

Coralie Vincent, Fanny Catteau, Dominique Boutet, Marion Blondel. The SignAge Corpus: Recording older signers with low cost motion capture devices. Corpora for Language and Aging Research (CLARe 4), Feb 2019, Helsinki, Finland. ⟨hal-02065452⟩
