Journal article — IEEE Transactions on Robotics, 2018

Robot–Robot Gesturing for Anchoring Representations

Abstract

In a multirobot system, using shared symbols for objects in the environment is a prerequisite for collaboration. Sharing symbols requires that each agent has anchored a symbol to an internal, sensor-level representation, and that these symbols match between the agents. The problem can be solved easily when the internal representations can be communicated between the agents. However, with heterogeneous embodiments the available sensors are likely to differ, making it impossible to share the internal representations directly. We propose the use of pointing gestures to align symbols across a heterogeneous group of robots. We describe a planning framework that minimizes the effort required for anchoring representations across robots. The framework allows planning for both the gesturing and the observing agent in a decentralized fashion, and it considers both implicit sources of failure, such as ambiguous pointing, and the costs incurred by actions. Simulation experiments demonstrate that the resulting planning problem has a complex solution structure with multiple local minima. A demonstration with a heterogeneous two-robot system shows the practical viability of the approach.
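The abstract describes the framework only at a high level; as a loose illustration of the trade-off it mentions (motion effort versus the risk of ambiguous pointing), the Python sketch below scores hypothetical pointing poses by expected cost and picks the cheapest. The class names, the angular-noise failure model, and all cost values are assumptions made for this example, not the paper's actual formulation.

```python
import math
from dataclasses import dataclass


@dataclass
class PointingCandidate:
    """A candidate pose from which the gesturing robot points at a target object."""
    move_cost: float            # effort to reach this pose (hypothetical units)
    target_angle: float         # bearing of the target object from the pointing ray (rad)
    distractor_angles: list     # bearings of the other objects from the same ray (rad)


def ambiguity_probability(cand, angular_noise=0.1):
    """Rough probability that the observer resolves the gesture to the wrong object.

    Assumption: the observer may mis-resolve a pointing gesture whenever a
    distractor lies within an angular noise band around the pointed direction.
    This is a stand-in for the paper's failure model, which is not given here.
    """
    close = sum(1 for a in cand.distractor_angles
                if abs(a - cand.target_angle) < angular_noise)
    total = 1 + len(cand.distractor_angles)
    return close / total


def expected_cost(cand, retry_cost=5.0, angular_noise=0.1):
    """Expected effort = motion cost + expected cost of re-gesturing on failure."""
    p_fail = ambiguity_probability(cand, angular_noise)
    return cand.move_cost + p_fail * retry_cost


def plan_gesture(candidates):
    """Pick the pointing candidate with the lowest expected anchoring effort."""
    return min(candidates, key=expected_cost)


if __name__ == "__main__":
    candidates = [
        PointingCandidate(move_cost=1.0, target_angle=0.0,
                          distractor_angles=[0.05, 0.4]),   # cheap but ambiguous
        PointingCandidate(move_cost=3.0, target_angle=0.0,
                          distractor_angles=[0.6, 0.9]),    # costlier but unambiguous
    ]
    best = plan_gesture(candidates)
    print("chosen candidate expected cost:", round(expected_cost(best), 3))
```

Even in this toy form, the costlier but unambiguous pose can win once the retry penalty is large enough, which mirrors the kind of non-trivial solution structure the abstract reports.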

Dates and versions

hal-01961433, version 1 (19-12-2018)

Identifiers

Cite

Polychronis Kondaxakis, Khurram Gulzar, Stefan Kinauer, Iasonas Kokkinos, Ville Kyrki. Robot–Robot Gesturing for Anchoring Representations. IEEE Transactions on Robotics, 2018, pp.1-15. ⟨10.1109/TRO.2018.2875388⟩. ⟨hal-01961433⟩