Multi-Modal Intention Prediction With Probabilistic Movement Primitives

Oriane Dermy 1 François Charpillet 1 Serena Ivaldi 1
1 LARSEN - Lifelong Autonomy and interaction skills for Robots in a Sensing ENvironment
Inria Nancy - Grand Est, LORIA - AIS - Department of Complex Systems, Artificial Intelligence & Robotics
Abstract: This paper proposes a method for multi-modal prediction of intention based on a probabilistic description of movement primitives and goals. We target dyadic interaction between a human and a robot in a collaborative scenario. The robot acquires multi-modal models of collaborative action primitives that combine gaze cues from the human partner with kinetic information about the manipulation primitives of its arm. We show that when the partner guides the robot with gaze cues, the robot recognizes the intended action primitive even when the actions are ambiguous. Furthermore, this prior knowledge acquired from gaze greatly improves the prediction of the intended future trajectory during physical interaction. Results with the humanoid robot iCub are presented and discussed.
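The trajectory-prediction step the abstract describes rests on Probabilistic Movement Primitives (ProMPs): a trajectory is modeled as basis functions weighted by a Gaussian-distributed weight vector, and observing the start of a new motion lets the robot condition that distribution to predict the remainder. The following is a minimal sketch under standard ProMP assumptions (radial-basis features, Gaussian weights); the function names, basis widths, and toy data are illustrative and are not the authors' implementation.

```python
import numpy as np

def rbf_features(t, n_basis=8, width=0.02):
    """Normalized radial-basis features over a phase variable t in [0, 1]."""
    centers = np.linspace(0, 1, n_basis)
    phi = np.exp(-(t[:, None] - centers[None, :]) ** 2 / (2 * width))
    return phi / phi.sum(axis=1, keepdims=True)

def learn_promp(demos, t):
    """Fit a weight distribution N(mu_w, Sigma_w) from demonstrations."""
    Phi = rbf_features(t)
    # Least-squares weight vector for each demonstrated trajectory
    W = np.stack([np.linalg.lstsq(Phi, d, rcond=None)[0] for d in demos])
    return W.mean(axis=0), np.cov(W.T) + 1e-6 * np.eye(W.shape[1])

def condition(mu_w, Sigma_w, t_obs, y_obs, noise=1e-4):
    """Condition the ProMP on early observations (Gaussian update)."""
    Phi_o = rbf_features(t_obs)
    S = Phi_o @ Sigma_w @ Phi_o.T + noise * np.eye(len(t_obs))
    K = Sigma_w @ Phi_o.T @ np.linalg.inv(S)
    mu_new = mu_w + K @ (y_obs - Phi_o @ mu_w)
    Sigma_new = Sigma_w - K @ Phi_o @ Sigma_w
    return mu_new, Sigma_new

# Toy demonstrations: ten noisy sine-like reaching motions
t = np.linspace(0, 1, 100)
rng = np.random.default_rng(0)
demos = [np.sin(np.pi * t) + 0.05 * rng.standard_normal(t.shape)
         for _ in range(10)]
mu_w, Sigma_w = learn_promp(demos, t)

# Observe only the first 20% of a new motion, then predict the rest
t_obs, y_obs = t[:20], np.sin(np.pi * t[:20])
mu_post, _ = condition(mu_w, Sigma_w, t_obs, y_obs)
prediction = rbf_features(t) @ mu_post
```

In the paper's multi-modal setting, the gaze cue would additionally select which learned primitive's prior (mu_w, Sigma_w) to condition, disambiguating actions before the physical interaction begins.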


https://hal.archives-ouvertes.fr/hal-01644585
Contributor: Oriane Dermy
Submitted on: Wednesday, November 22, 2017 - 2:21:52 PM
Last modification on: Tuesday, December 18, 2018 - 4:40:22 PM

File

multimodal_prompV2.pdf (produced by the author(s))

Identifiers

  • HAL Id: hal-01644585, version 1

Citation

Oriane Dermy, François Charpillet, Serena Ivaldi. Multi-Modal Intention Prediction With Probabilistic Movement Primitives. HFR 2017 - 10th International Workshop on Human-Friendly Robotics, Nov 2017, Napoli, Italy. pp.1-15. ⟨hal-01644585⟩

Metrics

Record views: 230
File downloads: 837