Direct Iterative Closest Point for Real-time Visual Odometry

Abstract: In RGB-D sensor-based visual odometry, the goal is to estimate a sequence of camera movements using image and/or range measurements. Direct methods solve the problem by minimizing an intensity error. In this work, a depth map obtained from an RGB-D sensor is considered as an additional measurement and combined with a direct photometric cost function. Minimizing the resulting bi-objective cost function produces the 3D camera motion parameters that register two 3D surfaces within the same coordinate system. The formulation requires neither predetermined temporal correspondences nor feature extraction, provided the frame rate is sufficient. It is shown how incorporating the depth measurement makes the cost function more robust to insufficient texture and non-Lambertian surfaces. Finally, the method is demonstrated in the Planetary Robotics Vision Ground Processing (PRoVisG) competition, where visual odometry and 3D reconstruction are computed for a stereo image sequence captured by a Mars rover.
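As a purely illustrative sketch (the notice does not give the exact formulation, so the symbols, the warp $w$, and the weight $\lambda$ below are assumptions rather than the authors' notation), a bi-objective cost of the kind described above typically sums a photometric residual and a geometric depth residual over the pixels $\mathbf{x}$ and is minimized over the 6-DOF camera pose $\mathbf{T}$:

$$
E(\mathbf{T}) \;=\; \sum_{\mathbf{x}} \Big( I^{*}\big(w(\mathbf{x};\mathbf{T})\big) - I(\mathbf{x}) \Big)^{2}
\;+\; \lambda \sum_{\mathbf{x}} \Big( D^{*}\big(w(\mathbf{x};\mathbf{T})\big) - \widehat{D}(\mathbf{x};\mathbf{T}) \Big)^{2}
$$

where $I$ and $D$ are the current intensity and depth images, $I^{*}$ and $D^{*}$ the reference images, and $\widehat{D}(\mathbf{x};\mathbf{T})$ the depth of pixel $\mathbf{x}$ after it has been transferred into the reference frame. Setting $\lambda = 0$ recovers a standard direct photometric error.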
Document type:
Conference paper
The Second International Workshop on Computer Vision in Vehicle Technology: From Earth to Mars, in conjunction with the International Conference on Computer Vision, Barcelona, Spain, 2011.

https://hal.archives-ouvertes.fr/hal-01357373
Contributor: Andrew Comport <>
Submitted on: Monday, 29 August 2016 - 16:35:34
Last modified on: Tuesday, 30 August 2016 - 01:04:54

Identifiers

  • HAL Id: hal-01357373, version 1

Citation

Tommy Tykkala, Cedric Audras, Andrew I. Comport. Direct Iterative Closest Point for Real-time Visual Odometry. The Second International Workshop on Computer Vision in Vehicle Technology: From Earth to Mars, in conjunction with the International Conference on Computer Vision, Barcelona, Spain, 2011. <hal-01357373>
