An Evaluation Framework to Assess and Correct the Multimodal Behavior of a Humanoid Robot in Human-Robot Interaction

Duc Canh Nguyen 1 Gérard Bailly 1 Frédéric Elisei 2, 1
1 GIPSA-CRISSP, Département Parole et Cognition (DPC), GIPSA-lab – Grenoble Images Parole Signal Automatique
2 GIPSA-Services, GIPSA-lab – Grenoble Images Parole Signal Automatique
Abstract: We discuss the key features of a new methodology that enables professional caregivers to teach a socially assistive robot (SAR) how to perform assistive tasks while giving verbal and coverbal instructions, demonstrations and feedback. We describe how socio-communicative gesture controllers – which drive the speech, facial displays and hand gestures of our iCub robot – are fed by multimodal events captured from a professional human demonstrator performing a neuropsychological interview. The paper focuses on the results of two crowd-sourced experiments in which we asked raters to evaluate the multimodal interactive behaviors of our SAR. We demonstrate that this framework reduces the behavioral errors of our robot. We also show that human expectations of the robot's functional capabilities increase with the quality of its performative behaviors.
https://hal.archives-ouvertes.fr/hal-01578713
Contributor : Gérard Bailly
Submitted on : Tuesday, August 29, 2017 - 4:11:41 PM
Last modification on : Monday, April 9, 2018 - 12:22:50 PM

File: Gespin_paper_vf.pdf (produced by the authors)

Identifiers

  • HAL Id : hal-01578713, version 1

Citation

Duc Canh Nguyen, Gérard Bailly, Frédéric Elisei. An Evaluation Framework to Assess and Correct the Multimodal Behavior of a Humanoid Robot in Human-Robot Interaction. GEstures and SPeech in INteraction (GESPIN), Aug 2017, Poznan, Poland. ⟨hal-01578713⟩
