Deep Sensor Fusion for Real-Time Odometry Estimation

Abstract: Cameras and 2D laser scanners, in combination, can provide low-cost, lightweight and accurate solutions, which makes their fusion well-suited for many robot navigation tasks. However, correct data fusion depends on precise calibration of the rigid-body transform between the sensors. In this paper we present the first framework that uses Convolutional Neural Networks (CNNs) for odometry estimation by fusing 2D laser scanners and mono-cameras. The use of CNNs provides the tools not only to extract the features from the two sensors, but also to fuse and match them without needing a calibration between the sensors. We transform the odometry estimation into an ordinal classification problem in order to find accurate rotation and translation values between consecutive frames. Results on a real road dataset show that the fusion network runs in real-time and improves the odometry estimation of a single sensor alone by learning how to fuse two different types of sensor data.
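The ordinal formulation mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the bin count, rotation range, and encoding/decoding helpers are all assumptions. The idea is that a continuous rotation between consecutive frames is discretized into K bins, and the bin index is encoded as K-1 cumulative binary targets ("is the rotation past threshold k?") that a network head could be trained to predict.

```python
import numpy as np

# Hypothetical parameters (assumed for illustration only).
K = 20                                  # number of rotation bins
edges = np.linspace(-0.5, 0.5, K + 1)   # bin edges in radians

def encode_ordinal(rotation):
    """Encode a continuous rotation as K-1 cumulative binary targets."""
    bin_idx = np.clip(np.digitize(rotation, edges) - 1, 0, K - 1)
    return (np.arange(K - 1) < bin_idx).astype(float)

def decode_ordinal(probs):
    """Decode per-threshold probabilities back to a rotation value."""
    bin_idx = int(np.sum(np.asarray(probs) > 0.5))  # count thresholds passed
    return 0.5 * (edges[bin_idx] + edges[bin_idx + 1])  # bin centre

# Round trip: encoding then decoding recovers the rotation up to bin width.
targets = encode_ordinal(0.12)
recovered = decode_ordinal(targets)
assert abs(recovered - 0.12) <= (edges[1] - edges[0])
```

At inference, replacing the hard targets with the network's sigmoid outputs gives the same decoding rule; the cumulative encoding is what distinguishes ordinal classification from plain one-hot classification, since it penalizes predictions proportionally to how far they land from the true bin.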
Document type: Conference papers
Cited literature: 30 references
Contributor: Michelle Valente
Submitted on: Monday, December 9, 2019 - 10:05:32 AM
Last modification on: Wednesday, November 17, 2021 - 12:31:07 PM
Long-term archiving on: Tuesday, March 10, 2020 - 4:39:26 PM

Files produced by the author(s)


  • HAL Id: hal-02399460, version 1
Michelle Valente, Cyril Joly, Arnaud de la Fortelle. Deep Sensor Fusion for Real-Time Odometry Estimation. IEEE/RSJ International Conference on Intelligent Robots and Systems, Nov 2019, Macau, China. ⟨hal-02399460⟩
