Deep Sensor Fusion for Real-Time Odometry Estimation

Abstract: Cameras and 2D laser scanners in combination can provide low-cost, lightweight, and accurate solutions, which makes their fusion well-suited for many robot navigation tasks. However, correct data fusion depends on precise calibration of the rigid-body transform between the sensors. In this paper we present the first framework that uses Convolutional Neural Networks (CNNs) for odometry estimation by fusing 2D laser scanners and monocular cameras. The CNNs not only extract the features from the two sensors, but also fuse and match them without requiring a calibration between the sensors. We cast odometry estimation as an ordinal classification problem in order to find accurate rotation and translation values between consecutive frames. Results on a real road dataset show that the fusion network runs in real time and improves on the odometry estimation of either sensor alone by learning how to fuse the two different types of sensor data.
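The ordinal-classification formulation mentioned in the abstract can be illustrated with a minimal sketch. All specifics below (bin range, bin width, function names) are our own assumptions for illustration, not details from the paper: the continuous rotation between consecutive frames is discretized into ordered bins, encoded as a cumulative binary target (one output per bin edge), and decoded by counting thresholded outputs.

```python
import numpy as np

# Hypothetical sketch of ordinal classification for a rotation value.
# Bin range, width, and names are assumptions, not taken from the paper.

def encode_ordinal(angle_deg, bin_edges):
    """Cumulative target: 1 for every bin edge the angle exceeds."""
    return (angle_deg > bin_edges).astype(np.float32)

def decode_ordinal(probs, bin_edges, bin_width):
    """Count confident outputs and map back to the matching bin center."""
    k = int((probs > 0.5).sum())
    return bin_edges[0] + k * bin_width - bin_width / 2.0

# Example: rotations between -10 and +10 degrees in 0.5-degree bins.
edges = np.arange(-10.0, 10.0, 0.5)
target = encode_ordinal(1.2, edges)          # leading ones, then zeros
angle = decode_ordinal(target, edges, 0.5)   # recovered to within half a bin
```

Compared with a plain one-hot softmax over bins, this cumulative encoding respects the ordering of the values: a prediction several bins away from the truth flips several targets, so larger angular errors are penalized more.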
Document type:
Conference papers

Cited literature: 30 references

https://hal.archives-ouvertes.fr/hal-02399460
Contributor: Michelle Valente
Submitted on: Monday, December 9, 2019 - 10:05:32 AM
Last modification on: Wednesday, January 8, 2020 - 1:54:33 AM

File: 1908.00524.pdf (produced by the author(s))

Identifiers

  • HAL Id: hal-02399460, version 1

Citation

Michelle Valente, Cyril Joly, Arnaud de la Fortelle. Deep Sensor Fusion for Real-Time Odometry Estimation. IEEE/RSJ International Conference on Intelligent Robots and Systems, Nov 2019, Macau, China. ⟨hal-02399460⟩
