Learning joint reconstruction of hands and manipulated objects - Archive ouverte HAL
Conference paper, 2019

Learning joint reconstruction of hands and manipulated objects

Abstract

Estimating hand-object manipulations is essential for interpreting and imitating human actions. Previous work has made significant progress towards reconstruction of hand poses and object shapes in isolation. Yet, reconstructing hands and objects during manipulation is a more challenging task due to significant occlusions of both the hand and object. While presenting challenges, manipulations may also simplify the problem since the physics of contact restricts the space of valid hand-object configurations. For example, during manipulation, the hand and object should be in contact but not interpenetrate. In this work, we regularize the joint reconstruction of hands and objects with manipulation constraints. We present an end-to-end learnable model that exploits a novel contact loss that favors physically plausible hand-object constellations. Our approach improves grasp quality metrics over baselines, using RGB images as input. To train and evaluate the model, we also propose a new large-scale synthetic dataset, ObMan, with hand-object manipulations. We demonstrate the transferability of ObMan-trained models to real data.
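The contact loss described above combines two opposing terms: an attraction term that encourages designated hand regions to touch the object surface, and a repulsion term that penalizes interpenetration. As an illustrative sketch only (not the paper's actual formulation), the idea can be shown with a toy object approximated by a sphere, so its signed distance function is cheap to evaluate; the function name, weights, and sphere approximation are all assumptions made here for clarity:

```python
import numpy as np

def contact_loss(hand_verts, center, radius, contact_idx,
                 lambda_attr=1.0, lambda_rep=2.0):
    """Toy contact loss. The object is approximated by a sphere
    (center, radius) so its signed distance is easy to compute.
    Attraction pulls the chosen contact vertices onto the surface;
    repulsion penalizes any vertex that penetrates the object.
    All names and weights are illustrative, not from the paper."""
    # signed distance of every hand vertex to the sphere surface
    # (positive outside, zero on the surface, negative inside)
    sdf = np.linalg.norm(hand_verts - center, axis=1) - radius
    # attraction: contact vertices should reach the surface (sdf <= 0)
    attr = np.mean(np.maximum(sdf[contact_idx], 0.0) ** 2)
    # repulsion: no vertex may lie inside the object (sdf < 0)
    rep = np.mean(np.maximum(-sdf, 0.0) ** 2)
    return lambda_attr * attr + lambda_rep * rep
```

A grasp whose contact vertices sit exactly on the surface with no vertex inside the object incurs zero loss, while a penetrating vertex is penalized quadratically; in the paper this kind of regularizer is applied to the full hand mesh and learned object mesh rather than a sphere.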
Main file

hassonCVPR2019.pdf (6.01 MB)
Origin: files produced by the author(s)

Dates and versions

hal-02429093, version 1 (06-01-2020)

Identifiers

Cite

Yana Hasson, Gül Varol, Dimitrios Tzionas, Igor Kalevatykh, Michael J Black, et al.. Learning joint reconstruction of hands and manipulated objects. CVPR 2019 - IEEE Conference on Computer Vision and Pattern Recognition, Jun 2019, Long Beach, United States. pp.11799-11808, ⟨10.1109/CVPR.2019.01208⟩. ⟨hal-02429093⟩
172 views
147 downloads
