Conference papers

Direct Iterative Closest Point for Real-time Visual Odometry

Abstract: In RGB-D sensor-based visual odometry, the goal is to estimate a sequence of camera motions from image and/or range measurements. Direct methods solve the problem by minimizing the intensity error. In this work, the depth map obtained from an RGB-D sensor is treated as an additional measurement and combined with a direct photometric cost function. Minimizing the resulting bi-objective cost function yields the 3D camera motion parameters that register two 3D surfaces in a common coordinate system. Given a sufficient frame rate, the formulation requires neither predetermined temporal correspondences nor feature extraction. It is shown how incorporating the depth measurement makes the cost function more robust to insufficient texture and non-Lambertian surfaces. Finally, the method is demonstrated in the Planetary Robotics Vision Ground Processing (PRoVisG) competition, where visual odometry and 3D reconstruction results are computed for a stereo image sequence captured by a Mars rover.
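The paper's exact cost function is not reproduced in this record; the following is a minimal sketch of the general idea of a bi-objective least-squares registration, in which photometric (intensity) and geometric (depth) residuals are stacked into one system and minimized jointly. The function names, the scalar weight `lam`, and the toy shapes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def bi_objective_cost(r_photo, r_depth, lam=0.5):
    """Combine photometric and depth residuals into one scalar cost.

    r_photo : intensity differences between reference and warped current image
    r_depth : depth differences between the two registered surfaces
    lam     : weight balancing the two objectives (a hypothetical choice)
    """
    r = np.concatenate([r_photo, lam * r_depth])
    return 0.5 * np.dot(r, r)

def gauss_newton_step(J_photo, r_photo, J_depth, r_depth, lam=0.5):
    """One Gauss-Newton update on the stacked bi-objective system."""
    # Stack the Jacobians and residuals of both objectives
    J = np.vstack([J_photo, lam * J_depth])
    r = np.concatenate([r_photo, lam * r_depth])
    # Solve the normal equations for the motion-parameter update
    # (6-DoF rigid motion in the visual-odometry setting)
    return np.linalg.solve(J.T @ J, -J.T @ r)
```

Because both residual blocks share the same motion parameters, a single solve updates the pose using intensity and depth evidence at once; when texture is poor the depth block still constrains the estimate, which is the robustness effect the abstract describes.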
Contributor: Andrew Comport
Submitted on: Monday, August 29, 2016 - 4:35:34 PM
Last modification on: Tuesday, May 26, 2020 - 6:50:35 PM


  • HAL Id: hal-01357373, version 1



Tommy Tykkala, Cedric Audras, Andrew I. Comport. Direct Iterative Closest Point for Real-time Visual Odometry. Second International Workshop on Computer Vision in Vehicle Technology: From Earth to Mars, in conjunction with the International Conference on Computer Vision, 2011, Barcelona, Spain. ⟨hal-01357373⟩
