Active Multisensory Perception and LearnIng For InteractivE Robots

Mathieu Lefort 1 Jean-Charles Quinton 2 Marie Avillac 3 Adrien Techer 3, 1
1 SMA - Systèmes Multi-Agents
LIRIS - Laboratoire d'InfoRmatique en Image et Systèmes d'information
2 SVH - Statistique pour le Vivant et l’Homme
LJK - Laboratoire Jean Kuntzmann
Abstract : The AMPLIFIER (Active Multisensory Perception and LearnIng For InteractivE Robots) project (2018-2022) will study how multisensory fusion and active perception influence each other within the developmental sensorimotor loop of an autonomous agent. Psychophysics experiments will provide insight into how active perception may influence multisensory fusion in humans. Using neural fields, a multi-scale computational neuroscience paradigm, we will model the behavioral observations in order to transfer and extend the extracted functional properties to social robots. In particular, we aim to provide more natural interactions with humans by giving the robot a better understanding of its environment and more appropriate contextual reactions to it.
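The abstract names neural fields as the modeling paradigm for multisensory fusion. As a rough illustration of the idea, here is a minimal 1-D Amari-style dynamic neural field fusing two overlapping unimodal inputs (e.g. a visual and an auditory cue of the same stimulus) into a single peak of activity. All parameters, kernel shapes, and variable names are illustrative assumptions, not taken from the project itself:

```python
import numpy as np

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# 1-D field over normalized space [0, 1]
n = 100
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]

# Mexican-hat lateral kernel: local excitation, broader inhibition
def kernel(d):
    return 1.0 * gaussian(d, 0.0, 0.05) - 0.5 * gaussian(d, 0.0, 0.2)

W = kernel(x[:, None] - x[None, :])  # n x n lateral connectivity

# Two unimodal inputs centered at slightly different positions,
# standing in for two sensory modalities reporting one stimulus
I = gaussian(x, 0.45, 0.06) + gaussian(x, 0.55, 0.06)

u = np.zeros(n)         # field potential
h = -0.2                # resting level
tau, dt = 10.0, 1.0     # time constant and integration step

# Amari dynamics: tau * du/dt = -u + h + W*f(u) + I
for _ in range(200):
    f = np.maximum(u, 0.0)          # rectified activation
    lateral = W @ f * dx            # discretized lateral interaction
    u += dt / tau * (-u + h + lateral + I)

# The field's peak is a fused position estimate between the two cues
peak = x[np.argmax(u)]
```

Because the two inputs overlap and the lateral competition favors a single bump, the converged peak sits between the two cue positions, which is the qualitative fusion behavior such field models exhibit.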
Cited literature [20 references]

https://hal.archives-ouvertes.fr/hal-01839427
Contributor : Jean-Charles Quinton
Submitted on : Saturday, July 14, 2018 - 7:30:18 PM
Last modification on : Thursday, November 21, 2019 - 2:29:47 AM
Long-term archiving on : Tuesday, October 16, 2018 - 1:33:17 AM

File

ICDL_17___Multimodality.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-01839427, version 1

Citation

Mathieu Lefort, Jean-Charles Quinton, Marie Avillac, Adrien Techer. Active Multisensory Perception and LearnIng For InteractivE Robots. Workshop on Computational Models for Crossmodal Learning - IEEE ICDL-EPIROB, Sep 2017, Lisbon, Portugal. pp.2. ⟨hal-01839427⟩
