FPGA-based bio-inspired architecture for multi-scale attentional vision
Abstract
Attention-based bio-inspired vision can be studied as a different way to consider sensor processing: first, it reduces the amount of data transmitted by connected cameras; second, it advocates a paradigm shift toward neuro-inspired post-processing of the few regions extracted from the visual field. The computational complexity of the corresponding vision models leads us to follow an in-sensor approach in the context of embedded systems. In this paper we propose an attention-based smart camera that extracts salient features based on retinal receptive fields at multiple scales and in real time, thanks to a dedicated hardware architecture. The results show that the entire visual chain can be embedded in an FPGA-SoC device delivering up to 60 frames per second. The features provided by the smart camera can then be learned by external neural networks to support various applications.
Keywords
field-programmable gate array (FPGA)
system-on-chip (SoC)
computer vision
computer architecture
cameras
sensor processing
visual chain
attention-based bio-inspired vision
attention-based smart camera
bio-inspired architecture
hardware architecture
multi-scale attentional vision
neural networks
retina receptive fields
salient feature extraction
feature extraction
hardware
real-time systems
robots
visualization