Towards a Mixed-Reality framework for autonomous driving - HAL open archive
Conference paper, Year: 2022


Abstract

Testing autonomous driving algorithms for mobile systems in simulation is an essential step to validate the models and train the system on a large set of (possibly unpredictable and critical) situations. Yet, transferring a model from simulation to reality is challenging due to the reality gap (i.e., discrepancies between reality and simulation models). Mixed-reality environments enable testing models on real vehicles without incurring financial and safety risks. Additionally, they can reduce the development costs of the system by enabling faster testing and debugging of mobile robots. This paper presents preliminary work towards a mixed-reality framework for autonomous navigation based on RGB-D cameras. The aim is to represent the objects of two environments within a single display using an augmentation strategy. We tested a first prototype by introducing a differential-drive robot able to navigate in its environment, visualize augmented objects, and detect them correctly using a pre-trained model based on Faster R-CNN.
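The abstract describes an augmentation strategy that merges real and virtual objects into a single display using RGB-D data; the paper itself holds the details, but one common way to realize such a merge is depth-aware compositing, where a virtual object is drawn only over real pixels it would occlude. The sketch below is a minimal illustration of that idea, assuming per-pixel depth maps for both the real camera frame and the rendered virtual object; the `composite` function and its inputs are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def composite(real_rgb, real_depth, virt_rgb, virt_depth):
    """Depth-aware compositing of a rendered virtual object into a real
    RGB-D frame. A virtual pixel replaces the real one only where the
    virtual surface is valid (finite depth) and closer to the camera
    than the real scene, so real objects correctly occlude the virtual
    one and vice versa."""
    mask = np.isfinite(virt_depth) & (virt_depth < real_depth)
    out = real_rgb.copy()
    out[mask] = virt_rgb[mask]
    return out

# Toy example: a 2x2 real frame (black, 5 m away everywhere) and a
# virtual object (white) covering two pixels, one in front of the real
# scene (1 m) and one behind it (10 m). Pixels with infinite virtual
# depth carry no virtual geometry at all.
real_rgb = np.zeros((2, 2, 3), dtype=np.uint8)
real_depth = np.full((2, 2), 5.0)
virt_rgb = np.full((2, 2, 3), 255, dtype=np.uint8)
virt_depth = np.array([[1.0, np.inf],
                       [np.inf, 10.0]])

frame = composite(real_rgb, real_depth, virt_rgb, virt_depth)
```

Only the 1 m virtual pixel ends up visible; the 10 m pixel stays hidden behind the real surface, which is the occlusion behavior a mixed-reality display needs so that augmented obstacles look consistent to both the viewer and a detector such as Faster R-CNN.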
Main file: paper1-1.pdf (2.3 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03841285, version 1 (07-11-2022)

Identifiers

  • HAL Id: hal-03841285, version 1

Cite

Imane Argui, Maxime Guériau, Samia Ainouz-Zemouche. Towards a Mixed-Reality framework for autonomous driving. ROS 2022 - 13th Workshop on Planning, Perception and Navigation for Intelligent Vehicles, Oct 2022, Kyoto, Japan. ⟨hal-03841285⟩
46 views
24 downloads
