Depth from motion algorithm and hardware architecture for smart cameras

Abstract: Applications such as autonomous navigation, robot vision, and autonomous flying require depth map information of the scene. Depth can be estimated with a single moving camera (depth from motion). However, traditional depth from motion algorithms have low processing speed and high hardware requirements that limit their embedded capabilities. In this work, we propose a hardware architecture for depth from motion consisting of a flow/depth transformation and a new optical flow algorithm. Our optical flow formulation is an extension of the stereo matching problem: a pixel-parallel/window-parallel approach in which a correlation function based on the Sum of Absolute Differences (SAD) computes the optical flow. Further, to improve the SAD performance, the curl of the intensity gradient is used as a preprocessing step. Experimental results demonstrate that it is possible to reach higher accuracy (90%) compared with previous FPGA-based optical flow algorithms. For depth estimation, our algorithm delivers dense maps with motion and depth information for all image pixels, with a processing speed up to 128 times faster than previous works, making it possible to achieve high performance in the context of embedded applications.
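As a rough illustration of the matching scheme summarized in the abstract, the Python sketch below implements SAD-based window matching over a small search range, with the curl of the intensity gradient approximated by a discrete cross-derivative difference as the preprocessing feature, followed by a simplified flow-to-depth conversion for a purely lateral camera translation. The window size, search range, finite-difference scheme, and the flow_to_depth helper are illustrative assumptions, not the paper's FPGA architecture.

    # Minimal sketch (not the authors' RTL design): SAD block matching for
    # optical flow, with a discrete curl-of-gradient preprocessing step.
    import numpy as np

    def curl_of_gradient(img):
        """Discrete approximation of curl(grad I) = d/dx(dI/dy) - d/dy(dI/dx).

        Analytically zero for a smooth field, but the finite-difference
        version responds to image texture and is used here as the feature
        on which SAD matching is performed (assumed preprocessing variant).
        """
        gy, gx = np.gradient(img.astype(np.float32))   # gradients along y, x
        d_gy_dx = np.gradient(gy, axis=1)              # d/dx of dI/dy
        d_gx_dy = np.gradient(gx, axis=0)              # d/dy of dI/dx
        return d_gy_dx - d_gx_dy

    def sad_flow(prev, curr, win=4, search=4):
        """Dense optical flow by exhaustive SAD search around each pixel."""
        f_prev, f_curr = curl_of_gradient(prev), curl_of_gradient(curr)
        h, w = prev.shape
        flow = np.zeros((h, w, 2), dtype=np.float32)
        margin = win + search
        for y in range(margin, h - margin):
            for x in range(margin, w - margin):
                ref = f_prev[y - win:y + win + 1, x - win:x + win + 1]
                best, best_dx, best_dy = np.inf, 0, 0
                for dy in range(-search, search + 1):
                    for dx in range(-search, search + 1):
                        cand = f_curr[y + dy - win:y + dy + win + 1,
                                      x + dx - win:x + dx + win + 1]
                        sad = np.abs(ref - cand).sum()  # Sum of Absolute Differences
                        if sad < best:
                            best, best_dx, best_dy = sad, dx, dy
                flow[y, x] = (best_dx, best_dy)
        return flow

    def flow_to_depth(flow_x, focal_px, baseline_m):
        """Flow-to-depth for a purely lateral camera translation
        (stereo-like case): Z = f * T / u. An illustrative simplification
        of a flow/depth transformation, not the paper's general model."""
        u = np.abs(flow_x)
        return np.where(u > 0, focal_px * baseline_m / u, np.inf)

In a pixel-parallel/window-parallel hardware realization, the two inner loops over (dy, dx) and the per-window absolute-difference sums would be unrolled into parallel datapaths rather than executed sequentially as in this software sketch.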

https://hal.archives-ouvertes.fr/hal-01964830
Contributor: Abiel Aguilar-González <>
Submitted on: Sunday, December 23, 2018 - 21:51:06
Last modified on: Tuesday, January 8, 2019 - 01:19:05

File

template.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-01964830, version 1

Citation

Abiel Aguilar-González, Miguel Arias-Estrada, François Berry. Depth from motion algorithm and hardware architecture for smart cameras. Sensors, MDPI, 2018. 〈hal-01964830〉
