Conference paper, Year: 2022

Fusing Event-based and RGB camera for Robust Object Detection in Adverse Conditions

Abstract

The ability to detect objects under image corruptions and in different weather conditions is vital for deep learning models, especially when applied to real-world applications such as autonomous driving. Traditional RGB-based detection fails under these conditions, so it is important to design a sensor suite with redundancy against failures of the primary frame-based detection. Event-based cameras can complement frame-based cameras in the low-light and high-dynamic-range scenarios that an autonomous vehicle encounters during navigation. Accordingly, we propose a redundant sensor fusion model of event-based and frame-based cameras that is robust to common image corruptions. The method uses a voxel grid representation of the events as input and two parallel feature extractor branches, one for frames and one for events. Our sensor fusion approach improves robustness to corruptions by over 30% compared to frame-only detection and also outperforms event-only detection. The model is trained and evaluated on the publicly released DSEC dataset.
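As a concrete illustration of the event representation mentioned in the abstract, the sketch below converts a raw event stream (x, y, t, p) into a fixed-size voxel grid using bilinear interpolation along the time axis, a common formulation for event voxel grids. The function and parameter names (`events_to_voxel_grid`, `num_bins`) are illustrative assumptions rather than the paper's released code, and details such as polarity handling or the number of temporal bins may differ from the authors' implementation.

```python
import numpy as np

def events_to_voxel_grid(events, num_bins, height, width):
    """Accumulate events into a (num_bins, H, W) voxel grid.

    `events` is an (N, 4) array of (x, y, t, p) rows, assumed sorted by
    timestamp, with polarity p in {-1, +1}. Each event's polarity is
    spread over its two nearest temporal bins (bilinear interpolation
    in time).
    """
    voxel = np.zeros((num_bins, height, width), dtype=np.float32)
    if len(events) == 0:
        return voxel

    x = events[:, 0].astype(np.int64)
    y = events[:, 1].astype(np.int64)
    t = events[:, 2].astype(np.float64)
    p = events[:, 3].astype(np.float32)

    # Normalize timestamps onto the bin axis [0, num_bins - 1].
    t_norm = (num_bins - 1) * (t - t[0]) / max(t[-1] - t[0], 1e-9)
    t0 = np.floor(t_norm).astype(np.int64)
    frac = (t_norm - t0).astype(np.float32)

    # Split each event's polarity between bins t0 and t0 + 1.
    np.add.at(voxel, (t0, y, x), p * (1.0 - frac))
    valid = t0 + 1 < num_bins
    np.add.at(voxel, (t0[valid] + 1, y[valid], x[valid]), p[valid] * frac[valid])
    return voxel
```

In a fusion pipeline of this kind, the resulting grid can be fed to the event branch of the detector much like a multi-channel image, while the RGB frame goes through the parallel frame branch.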
Main file: Event_camera_sensor_fusion.pdf (1.3 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03591717, version 1 (30-03-2022)

Identifiers

  • HAL Id: hal-03591717, version 1

Cite

Abhishek Tomy, Anshul Paigwar, Khushdeep Singh Mann, Alessandro Renzaglia, Christian Laugier. Fusing Event-based and RGB camera for Robust Object Detection in Adverse Conditions. ICRA 2022 - IEEE International Conference on Robotics and Automation, May 2022, Philadelphia, United States. ⟨hal-03591717⟩
604 views
1329 downloads
