Sensor-based Navigation of Omnidirectional Wheeled Robots Dealing with both Collisions and Occlusions

Abstract: Navigation tasks are often subject to several constraints, related either to the sensors (visibility) or to the environment (obstacles). In this paper, we propose a framework for autonomous omnidirectional vehicles that takes into account both collision and occlusion risks during sensor-based navigation. The task consists in driving the vehicle towards a visual target in the presence of static and moving obstacles. The target is acquired by fixed on-board sensors with a limited field of view, while the surrounding obstacles are detected by lidar scanners. To perform the task, the vehicle must not only keep the target in view while avoiding the obstacles, but also predict the target's location in case of occlusion. The effectiveness of our approach is validated through several experiments.
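
As a rough illustration of the occlusion handling described in the abstract, the sketch below extrapolates the target's position with a constant-velocity model whenever the visual measurement drops out. It is not the estimator used in the paper; the class and method names are hypothetical, and planar motion with noiseless measurements is assumed.

import numpy as np

class OccludedTargetPredictor:
    """Toy constant-velocity predictor for a visual target that may be occluded."""

    def __init__(self):
        self.position = None          # last estimated target position (x, y)
        self.velocity = np.zeros(2)   # last estimated target velocity
        self.last_time = None

    def update(self, measurement, t):
        """Update with a new measurement, or with None if the target is occluded."""
        if measurement is not None:
            measurement = np.asarray(measurement, dtype=float)
            if self.position is not None and t > self.last_time:
                # Finite-difference velocity estimate from consecutive detections.
                self.velocity = (measurement - self.position) / (t - self.last_time)
            self.position = measurement
        elif self.position is not None:
            # Target occluded: propagate the last estimate with constant velocity.
            self.position = self.position + self.velocity * (t - self.last_time)
        self.last_time = t
        return self.position

# Example: the target is seen twice, then occluded for one step.
predictor = OccludedTargetPredictor()
predictor.update((2.0, 0.5), t=0.0)
predictor.update((2.2, 0.6), t=0.1)
print(predictor.update(None, t=0.2))   # extrapolated position, roughly (2.4, 0.7)

In practice, such a prediction would be fed to the visual controller so that the camera keeps pointing at the expected target location until the target reappears.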
Document type: Journal article

Cited literature: 25 references

https://hal.archives-ouvertes.fr/hal-01625946
Contributor: Andrea Cherubini
Submitted on: Monday, May 20, 2019 - 4:40:43 PM
Last modification on: Wednesday, June 19, 2019 - 2:29:44 PM

File: Sensors_Based_Navigation_VF.pd... (produced by the author(s))

Identifiers

  • HAL Id: hal-01625946, version 5

Citation

Abdellah Khelloufi, Nouara Achour, Robin Passama, Andrea Cherubini. Sensor-based Navigation of Omnidirectional Wheeled Robots Dealing with both Collisions and Occlusions. Robotica, Cambridge University Press, 2019. ⟨hal-01625946v5⟩
