Robot–Robot Gesturing for Anchoring Representations

Abstract: In a multirobot system, using shared symbols for objects in the environment is a prerequisite for collaboration. Sharing symbols requires that each agent anchor a symbol to an internal, sensor-level representation and that these symbols match across agents. The problem can be solved easily when the internal representations can be communicated between the agents. However, with heterogeneous embodiments the available sensors are likely to differ, making it impossible to share the internal representations directly. We propose the use of pointing gestures to align symbols across a heterogeneous group of robots. We describe a planning framework that minimizes the effort required for anchoring representations across robots. The framework allows planning for both the gesturing and observing agents in a decentralized fashion. It considers both implicit sources of failure, such as ambiguous pointing, and the costs incurred by actions. Simulation experiments demonstrate that the resulting planning problem has a complex solution structure with multiple local minima. A demonstration with a heterogeneous two-robot system shows the practical viability of the approach.
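To make the cost structure concrete, the sketch below (in Python) illustrates one plausible way a gesturing agent could score candidate pointing configurations by combining the motion cost of a gesture with a penalty for ambiguous pointing. The candidate representation, the Gaussian ambiguity model, and the penalty weight are illustrative assumptions only, not the formulation used in the paper.

# Illustrative sketch (assumed model, not the paper's actual planner):
# choose the pointing configuration minimizing expected effort, where
# effort combines the motion cost of the gesture with a penalty for
# ambiguity (several objects lying close to the pointing ray).

import math
from dataclasses import dataclass

@dataclass
class Candidate:
    pose: tuple           # (x, y, heading) of the gesturing robot (hypothetical)
    motion_cost: float    # cost of reaching this pose and executing the gesture
    ray_distances: list   # distance from the pointing ray to each object in the scene

def ambiguity(c: Candidate, target_idx: int, sigma: float = 0.1) -> float:
    """Probability that the observer resolves the gesture to the wrong object,
    modeled (as an assumption) with a Gaussian falloff around the pointing ray."""
    weights = [math.exp(-(d / sigma) ** 2) for d in c.ray_distances]
    total = sum(weights)
    return 1.0 - weights[target_idx] / total if total > 0 else 1.0

def expected_cost(c: Candidate, target_idx: int, failure_penalty: float = 10.0) -> float:
    # Expected effort = motion cost + penalty weighted by failure probability.
    return c.motion_cost + failure_penalty * ambiguity(c, target_idx)

def best_candidate(candidates, target_idx):
    return min(candidates, key=lambda c: expected_cost(c, target_idx))

Under such a model the gesturing agent would select the lowest-cost candidate; per the abstract, the observing agent plans its own actions analogously in a decentralized fashion.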
Document type:
Journal article
IEEE Transactions on Robotics, IEEE, 2018, pp. 1-15. DOI: 10.1109/TRO.2018.2875388

https://hal.archives-ouvertes.fr/hal-01961433
Contributor: Stefan Kinauer
Submitted on: Wednesday, December 19, 2018 - 22:15:57
Last modified on: Thursday, February 7, 2019 - 16:34:19

Citation

Polychronis Kondaxakis, Khurram Gulzar, Stefan Kinauer, Iasonas Kokkinos, Ville Kyrki. Robot–Robot Gesturing for Anchoring Representations. IEEE Transactions on Robotics, IEEE, 2018, pp. 1-15. DOI: 10.1109/TRO.2018.2875388. hal-01961433
