Embedded vision-based localization and model predictive control for autonomous exploration
Abstract
This paper presents a complete mobile robot architecture for autonomous exploration of an unknown, GPS-denied environment. The platform is equipped with wheel encoders and stereo-vision and depth sensors, whose measurements are fused within an extended Kalman filter for robust localization. An occupancy grid of the environment is built online for environment reconstruction and obstacle detection. Based on this map, a model predictive control scheme autonomously defines safe exploration trajectories while taking into account the constraints of the imaging sensors. Experimental results demonstrate the embedded computational capability of this vision-based control loop.
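The abstract's localization step rests on a standard extended Kalman filter predict/update cycle: wheel-encoder odometry drives the prediction, and an exteroceptive position fix (here, vision-derived) drives the correction. The following is a minimal illustrative sketch of that cycle for a planar robot pose, not the paper's actual implementation; the motion model, noise covariances, and measurement model are all assumptions chosen for clarity.

```python
import numpy as np

def ekf_predict(x, P, u, Q):
    """Predict pose x = [px, py, theta] from odometry u = [d, dtheta]
    (distance travelled and heading change over one step)."""
    px, py, th = x
    d, dth = u
    x_pred = np.array([px + d * np.cos(th),
                       py + d * np.sin(th),
                       th + dth])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1.0, 0.0, -d * np.sin(th)],
                  [0.0, 1.0,  d * np.cos(th)],
                  [0.0, 0.0,  1.0]])
    return x_pred, F @ P @ F.T + Q

def ekf_update(x, P, z, R):
    """Correct the pose with an absolute position fix z = [px, py]."""
    H = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])      # measurement Jacobian
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    return x + K @ y, (np.eye(3) - K @ H) @ P

# One predict/update cycle with illustrative noise levels
x = np.zeros(3)
P = np.eye(3) * 0.1
x, P = ekf_predict(x, P, u=np.array([1.0, 0.0]), Q=np.eye(3) * 0.01)
x, P = ekf_update(x, P, z=np.array([1.05, -0.02]), R=np.eye(2) * 0.05)
```

After the update, the estimate lies between the odometric prediction and the vision measurement, weighted by their covariances, and the position uncertainty in `P` shrinks; this is the fusion behaviour the abstract attributes to the filter.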
Origin: publisher files authorized on an open archive