Journal article in Frontiers in Neuroscience, 2014

Asynchronous visual event-based time-to-contact

Abstract

Reliable and fast sensing of the environment is a fundamental requirement for autonomous mobile robotic platforms. Unfortunately, the frame-based acquisition paradigm underlying mainstream artificial perception systems suffers from low temporal dynamics and a redundant data flow, leading to high computational costs. Conventional sensing and its associated computation are therefore ill-suited to high-speed, sensor-based reactive control for mobile applications, which impose strict limits on energy consumption and computational load. This paper introduces a fast obstacle-avoidance method based on the output of an asynchronous, event-based, time-encoded imaging sensor. The proposed method relies on an event-based time-to-contact (TTC) computation derived from visual event-based motion flow. The approach is event-based in the sense that every incoming event contributes to the computation, allowing fast avoidance responses. The method is validated indoors on a mobile robot by comparing the event-based TTC with a TTC obtained from a laser range finder, showing that event-based sensing opens new perspectives for sensing in mobile robotics.
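To make the per-event idea concrete, below is a minimal, hypothetical sketch of event-based TTC estimation, assuming a known focus of expansion (FOE) and a local plane-fitting estimate of the event-based visual flow. The function names (`local_flow`, `event_ttc`), the neighborhood radius, the time window, and the fixed FOE are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def local_flow(events, x, y, t, radius=3, window=10_000.0):
    """Estimate visual flow at (x, y, t) by fitting a plane t = a*x + b*y + c
    to recent neighboring events; the flow is the pseudo-inverse of the
    spatio-temporal surface gradient (a, b).

    events: (N, 3) array of (x, y, t) rows, timestamps in microseconds.
    Returns (vx, vy) in pixels per microsecond, or None if ill-conditioned.
    """
    mask = ((np.abs(events[:, 0] - x) <= radius) &
            (np.abs(events[:, 1] - y) <= radius) &
            (t - events[:, 2] >= 0) &
            (t - events[:, 2] <= window))
    pts = events[mask]
    if len(pts) < 5:
        return None
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    (a, b, _), *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    g2 = a * a + b * b
    if g2 < 1e-12:  # flat spatio-temporal surface: no measurable motion
        return None
    return np.array([a / g2, b / g2])

def event_ttc(event, events, foe=(64.0, 64.0)):
    """TTC at one event: radial distance from the FOE divided by the radial
    (expansion) component of the local flow. Returns TTC in microseconds,
    or None when the flow is unreliable or the scene is not expanding."""
    x, y, t = event
    v = local_flow(events, x, y, t)
    if v is None:
        return None
    r = np.array([x - foe[0], y - foe[1]])
    dist = np.linalg.norm(r)
    if dist < 1e-9:
        return None
    radial_speed = float(np.dot(v, r / dist))  # > 0 when expanding
    if radial_speed <= 1e-12:
        return None
    return dist / radial_speed
```

In such a scheme, each incoming event (x, y, t) would be appended to a short event buffer and passed to `event_ttc`, and an avoidance maneuver triggered when the minimum TTC over recent events falls below a threshold; this is why the computation updates with every event rather than once per frame. The plane-fitting flow and the relation TTC = (distance from FOE) / (radial flow speed) are standard constructions for event cameras under translational motion, used here only to illustrate the kind of computation the abstract describes.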
Main file: fnins-08-00009.pdf (2.32 MB)
Origin: publication funded by an institution

Dates and versions

hal-01324452, version 1 (June 1, 2016)

License

Attribution (CC BY)

Identifiers

Cite

Xavier Clady, Charles Clercq, Sio-Hoi Ieng, Fouzhan Houseini, Marco Randazzo, et al. Asynchronous visual event-based time-to-contact. Frontiers in Neuroscience, 2014, 8, pp. 9. ⟨10.3389/fnins.2014.00009⟩. ⟨hal-01324452⟩