Brain-Inspired Coding of Robot Body Schema Through Visuo-Motor Integration of Touched Events

Abstract: Representing objects in space is difficult because sensorimotor events are anchored in different reference frames, which can be eye-, arm- or target-centered. In the brain, gain-field (GF) neurons in the parietal cortex compute the spatial transformations needed to align tactile, visual and proprioceptive signals. In reaching tasks, these GF neurons exploit a mechanism based on multiplicative interaction to bind simultaneous touch events on the hand with visual and proprioceptive information. By doing so, they can infer new reference frames to represent dynamically the location of the body parts in the visual space (i.e., the body schema) and of nearby targets (i.e., the peripersonal space). Along these lines, we propose a neural model based on GF neurons that integrates tactile events with arm postures and visual locations to construct hand- and target-centered receptive fields in the visual space. In robotic experiments using an artificial skin, we show how our neural architecture reproduces the behavior of parietal neurons (1) by dynamically encoding the body schema of our robotic arm without any visual tags on it and (2) by estimating the relative orientation and distance of targets to it. We demonstrate how tactile information facilitates the integration of visual and proprioceptive signals in order to construct the body space.
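The multiplicative gain-field mechanism summarized in the abstract can be illustrated with a small numerical sketch. The toy model below is not the authors' architecture: it uses a generic 1-D gain-field population (Gaussian retinal tuning multiplied by a Gaussian postural gain, a standard textbook formulation) and a hypothetical linear read-out to show how the product of visual and proprioceptive signals yields a hand-centered estimate of a target.

```python
import numpy as np

# Illustrative sketch of gain-field (GF) coding: each unit's response is the
# PRODUCT of a visual tuning curve (retina-centered position) and a postural
# gain (proprioceptive arm angle). A weighted read-out over the population
# then performs the coordinate shift from the retinal to the hand frame.
# All tuning parameters here are illustrative, not the paper's actual model.

def gaussian_tuning(x, centers, sigma):
    """Population of Gaussian receptive fields over a 1-D variable."""
    return np.exp(-(x - centers) ** 2 / (2 * sigma ** 2))

# 1-D toy world: positions and angles in degrees.
visual_centers = np.linspace(-40, 40, 21)  # preferred retinal positions
gain_centers = np.linspace(-40, 40, 21)    # preferred arm angles

def gf_layer(retinal_pos, arm_angle, sigma=8.0):
    """Multiplicative (gain-field) interaction of vision and proprioception."""
    v = gaussian_tuning(retinal_pos, visual_centers, sigma)  # shape (21,)
    g = gaussian_tuning(arm_angle, gain_centers, sigma)      # shape (21,)
    return np.outer(v, g)                                    # (21, 21) GF map

def decode_hand_centered(gf):
    """Read out the target position relative to the hand.
    A unit tuned to retinal position r and arm angle a votes for the
    hand-centered coordinate r - a; responses weight the votes."""
    R, A = np.meshgrid(visual_centers, gain_centers, indexing="ij")
    return np.sum(gf * (R - A)) / np.sum(gf)

# A target seen at +10 deg on the retina while the arm points at -5 deg
# lies about 15 deg from the hand, regardless of either individual frame.
est = decode_hand_centered(gf_layer(10.0, -5.0))
```

The key point the sketch captures is that no single unit encodes the hand-centered position: it emerges only from the population of multiplicative responses, which is the property attributed to parietal GF neurons.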
Document type: Journal article
Contributor: Alexandre Pitti
Submitted on: Wednesday, March 20, 2019
Last modified on: Thursday, June 6, 2019
Ganna Pugach, Alexandre Pitti, Olga Tolochko, Philippe Gaussier. Brain-Inspired Coding of Robot Body Schema Through Visuo-Motor Integration of Touched Events. Frontiers in Neurorobotics, 2019, 13, ⟨10.3389/fnbot.2019.00005⟩. ⟨hal-02073946⟩