Self-calibrating smooth pursuit through active efficient coding

Abstract: This paper presents a model for the autonomous learning of smooth pursuit eye movements based on an efficient coding criterion for active perception. This model accounts for the joint development of visual encoding and eye control. Sparse coding models encode the incoming data at two different spatial resolutions and capture the statistics of the input in spatio-temporal basis functions. A reinforcement learner controls eye velocity so as to maximize a reward signal based on the efficiency of the encoding. We consider the embodiment of the approach in the iCub simulator and real robot. Motion perception and smooth pursuit control are not explicitly expressed as tasks for the robot to achieve but emerge as the result of the system's active attempt to efficiently encode its sensory inputs. Experiments demonstrate that the proposed approach is self-calibrating and robust to strong perturbations of the perception-action link.
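To make the active efficient coding idea concrete, the following minimal Python sketch illustrates the loop described in the abstract. It is not the authors' implementation: a greedy matching-pursuit encoder over a single scale stands in for the learned two-resolution spatio-temporal sparse codes, and a simple bandit-style value update stands in for the reinforcement learner. The reward is the negative reconstruction error, so the controller is driven toward the eye velocity that nulls retinal slip and makes the input easiest to encode. All names, parameters, and the toy input model are illustrative assumptions.

```python
import numpy as np

# Sketch of the active efficient coding loop (illustrative, not the paper's code).
rng = np.random.default_rng(0)

N_BASES, PATCH_DIM, N_ACTIVE = 50, 2 * 8 * 8, 10   # bases, 2 frames of 8x8 pixels, sparsity level
VELOCITIES = np.linspace(-1.0, 1.0, 11)            # candidate eye-velocity commands

bases = rng.standard_normal((N_BASES, PATCH_DIM))
bases /= np.linalg.norm(bases, axis=1, keepdims=True)
q_values = np.zeros(len(VELOCITIES))               # value estimate of each velocity command

def encode_sparse(patch):
    """Greedy matching pursuit: approximate the patch with N_ACTIVE basis functions."""
    residual, coeffs = patch.copy(), np.zeros(N_BASES)
    for _ in range(N_ACTIVE):
        scores = bases @ residual
        k = np.argmax(np.abs(scores))
        coeffs[k] += scores[k]
        residual -= scores[k] * bases[k]
    return coeffs, residual

def grab_patch(retinal_slip):
    """Toy stand-in for a spatio-temporal patch; larger slip corrupts the signal more."""
    signal = np.sin(np.linspace(0, 4 * np.pi, PATCH_DIM))
    return signal + np.abs(retinal_slip) * rng.standard_normal(PATCH_DIM)

target_velocity, epsilon, lr = 0.4, 0.1, 0.1
for step in range(2000):
    a = rng.integers(len(VELOCITIES)) if rng.random() < epsilon else np.argmax(q_values)
    slip = target_velocity - VELOCITIES[a]          # residual image motion on the retina
    coeffs, residual = encode_sparse(grab_patch(slip))
    reward = -np.sum(residual ** 2)                 # coding efficiency: low reconstruction error
    q_values[a] += lr * (reward - q_values[a])      # simple value update toward the reward
    # (in the paper the sparse bases are also adapted online; omitted in this sketch)

print("preferred velocity:", VELOCITIES[np.argmax(q_values)], "target:", target_velocity)
```

Under these assumptions the preferred velocity converges to the command that cancels the target's motion, which is the sense in which pursuit "emerges" from the drive to encode the input efficiently rather than from an explicit tracking objective.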

https://hal.archives-ouvertes.fr/hal-01113340
Contributor: Céline Teulière
Submitted on: Thursday, February 5, 2015 - 09:56:34
Last modified on: Thursday, January 11, 2018 - 06:24:24
Document(s) archived on: Wednesday, May 6, 2015 - 10:15:15

File

ctRAS-manuscript.pdf
Files produced by the author(s)

Citation

Céline Teulière, S. Forestier, L. Lonini, Cong Zhang, Y. Zhao, et al. Self-calibrating smooth pursuit through active efficient coding. Robotics and Autonomous Systems, Elsevier, 2014. doi:10.1016/j.robot.2014.11.006. http://www.sciencedirect.com/science/article/pii/S0921889014002486. hal-01113340.
