Conference Paper, Year: 2019

Deep Sensor Fusion for Real-Time Odometry Estimation

Michelle Valente
Cyril Joly
Arnaud de La Fortelle

Abstract

Cameras and 2D laser scanners, in combination, can provide low-cost, lightweight and accurate solutions, which makes their fusion well-suited for many robot navigation tasks. However, correct data fusion depends on a precise calibration of the rigid-body transform between the sensors. In this paper we present the first framework that uses Convolutional Neural Networks (CNNs) for odometry estimation by fusing 2D laser scanners and monocular cameras. The CNNs not only extract features from the two sensors, but also fuse and match them without requiring a calibration between the sensors. We cast odometry estimation as an ordinal classification problem in order to obtain accurate rotation and translation values between consecutive frames. Results on a real road dataset show that the fusion network runs in real time and improves the odometry estimate of either sensor alone by learning how to fuse the two different types of data.
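The approach described in the abstract (one CNN encoder per sensor, feature-level fusion without extrinsic calibration, and classification over discretized rotation/translation bins) can be sketched roughly as below. This is a minimal illustrative sketch in PyTorch; the layer shapes, bin counts, and the names ScanEncoder, ImageEncoder and FusionOdometryNet are assumptions made here for illustration, not the architecture used in the paper.

```python
# Illustrative sketch only: two CNN encoders (laser scans, camera frames),
# concatenated features, and classification heads over discretized motion bins.
# All sizes and names below are assumptions, not the authors' architecture.
import torch
import torch.nn as nn

class ScanEncoder(nn.Module):
    """Encodes a pair of consecutive 2D laser scans stacked as 1D channels."""
    def __init__(self, out_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(2, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(64, out_dim), nn.ReLU(),
        )

    def forward(self, scans):           # scans: (B, 2, n_beams)
        return self.net(scans)

class ImageEncoder(nn.Module):
    """Encodes a pair of consecutive grayscale frames stacked as channels."""
    def __init__(self, out_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 32, kernel_size=7, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, out_dim), nn.ReLU(),
        )

    def forward(self, imgs):            # imgs: (B, 2, H, W)
        return self.net(imgs)

class FusionOdometryNet(nn.Module):
    """Fuses both feature vectors and classifies discretized frame-to-frame motion.

    Rotation and translation between consecutive frames are quantized into
    bins, so odometry becomes a classification problem.
    """
    def __init__(self, n_rot_bins=180, n_trans_bins=100, feat_dim=256):
        super().__init__()
        self.scan_enc = ScanEncoder(feat_dim)
        self.img_enc = ImageEncoder(feat_dim)
        self.fusion = nn.Sequential(nn.Linear(2 * feat_dim, 512), nn.ReLU())
        self.rot_head = nn.Linear(512, n_rot_bins)
        self.trans_head = nn.Linear(512, n_trans_bins)

    def forward(self, scans, imgs):
        fused = self.fusion(torch.cat([self.scan_enc(scans),
                                       self.img_enc(imgs)], dim=1))
        return self.rot_head(fused), self.trans_head(fused)

if __name__ == "__main__":
    net = FusionOdometryNet()
    scans = torch.randn(4, 2, 360)       # two consecutive 360-beam scans
    imgs = torch.randn(4, 2, 96, 320)    # two consecutive grayscale frames
    rot_logits, trans_logits = net(scans, imgs)
    print(rot_logits.shape, trans_logits.shape)   # (4, 180) (4, 100)
```

In such a formulation the predicted bins (e.g. the argmax of each head) would be mapped back to metric rotation and translation values, and both heads would be trained with a standard classification loss such as cross-entropy.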
Main file
1908.00524.pdf (4.54 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-02399460 , version 1 (09-12-2019)

Identifiers

  • HAL Id : hal-02399460 , version 1

Cite

Michelle Valente, Cyril Joly, Arnaud de La Fortelle. Deep Sensor Fusion for Real-Time Odometry Estimation. IEEE/RSJ International Conference on Intelligent Robots and Systems, Nov 2019, Macau, China. ⟨hal-02399460⟩
85 views
148 downloads
