Smile and laughter detection for elderly people-robot interaction

Abstract : Affect bursts play an important role in non-verbal social interaction. Laughter and smiles are among the most important social markers in human-robot social interaction. Not only do they convey affective information, they may also reveal the user's communication strategy. In the context of human-robot interaction, an automatic laughter and smile detection system may thus help the robot adapt its behavior to a given user's profile by adopting a more relevant communication scheme. While much interesting work has been done on laughter and smile detection, only a few studies have focused on elderly people. Data from elderly people are relatively scarce and often pose a significant challenge to laughter and smile detection systems due to facial wrinkles and frequently lower voice quality. In this paper, we address laughter and smile detection in the ROMEO2 corpus, a multimodal (audio and video) corpus of elderly people-robot interaction. We show that, while each single modality achieves a certain level of performance, a fair improvement can be obtained by combining the two modalities.
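
The abstract does not state how the audio and video modalities are combined. As a minimal sketch only, a late (score-level) fusion of two independent classifiers could look like the following; the function name, the fusion weight, and the example scores are hypothetical and not taken from the paper.

```python
# Minimal sketch of late (score-level) fusion of two modality classifiers.
# Hypothetical illustration: the actual fusion scheme used in the paper
# is not described in this abstract.
import numpy as np

def fuse_scores(audio_scores, video_scores, audio_weight=0.5):
    """Weighted sum of per-segment laughter/smile scores from two modalities."""
    audio_scores = np.asarray(audio_scores, dtype=float)
    video_scores = np.asarray(video_scores, dtype=float)
    return audio_weight * audio_scores + (1.0 - audio_weight) * video_scores

# Example: per-segment posterior scores for the "laughter/smile" class.
audio = [0.35, 0.80, 0.10]   # e.g. scores from an audio classifier
video = [0.60, 0.75, 0.20]   # e.g. scores from a facial-expression classifier
fused = fuse_scores(audio, video, audio_weight=0.4)
detections = fused > 0.5     # final decision after fusion
print(fused, detections)
```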

Identifiers

  • HAL Id : hal-01836445, version 1
  • URL : https://hal.archives-ouvertes.fr/hal-01836445

Citation

Fan Yang, Mohamed El Amine Sehili, Claude Barras, Laurence Devillers. Smile and laughter detection for elderly people-robot interaction. International Conference on Social Robotics, Jan 2015, Paris, France. ⟨hal-01836445⟩
