Conference papers

Fully automatic extrinsic calibration of RGB-D system using two views of natural scene

Abstract: RGB-D sensors, such as the low-cost Kinect, are widely used in robotics applications. Obstacle Avoidance (OA), Simultaneous Localization And Mapping (SLAM), and Mobile Object Tracking (MOT) all need accurate information about the position of objects in the environment. 3D cameras are very convenient for these tasks, but as low-cost sensors they have to be complemented by other sensors: cameras, laser range finders, ultrasonic or infrared telemeters. In order to exploit all sensor data in the same algorithm, the data must be expressed in a common reference frame. In other words, we have to know the rigid transformation between the sensor frames. In this paper, we propose a new method to retrieve the rigid transformation (known as the extrinsic parameters in the calibration process) between a depth camera and a conventional camera. We show that such a method is sufficiently accurate without requiring user interaction or a special calibration pattern, unlike other common calibration processes.
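Once the extrinsic parameters (a rotation R and translation t) between the depth camera and the conventional camera are known, expressing depth points in the camera's frame is a single rigid-body transform. The sketch below is illustrative only, not the paper's calibration method: the values of R and t are hypothetical placeholders for what a calibration procedure would estimate.

```python
import numpy as np

# Hypothetical extrinsic parameters between the depth-sensor frame and
# the RGB-camera frame. A real calibration (such as the method proposed
# in the paper) would estimate these; the values here are illustrative.
theta = np.deg2rad(5.0)                      # small rotation about the y-axis
R = np.array([[ np.cos(theta), 0.0, np.sin(theta)],
              [ 0.0,           1.0, 0.0          ],
              [-np.sin(theta), 0.0, np.cos(theta)]])
t = np.array([0.05, 0.0, 0.0])               # 5 cm baseline, in metres

def depth_to_rgb_frame(points):
    """Express Nx3 points given in the depth-sensor frame in the RGB frame."""
    return points @ R.T + t

# A point one metre in front of the depth camera, seen from the RGB camera.
p_depth = np.array([[0.0, 0.0, 1.0]])
p_rgb = depth_to_rgb_frame(p_depth)
```

With all sensor measurements mapped into one frame this way, data from the depth camera and the conventional camera can be fused in a single OA, SLAM, or MOT algorithm.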
Contributor: Frédéric Davesne
Submitted on: Monday, April 27, 2015 - 11:54:04 AM
Last modification on: Saturday, April 16, 2022 - 3:37:56 AM



Jean-Clement Devaux, Hicham Hadj-Abdelkader, Etienne Colle. Fully automatic extrinsic calibration of RGB-D system using two views of natural scene. 13th International Conference on Control Automation Robotics and Vision (ICARCV 2014), Dec 2014, Singapore. pp. 894--900, ⟨10.1109/ICARCV.2014.7064423⟩. ⟨hal-01145890⟩