Budget Restricted Incremental Learning with Pre-Trained Convolutional Neural Networks and Binary Associative Memories
Conference Paper, Year: 2017


Abstract

Thanks to their ability to absorb large amounts of data, Convolutional Neural Networks (CNNs) have become state-of-the-art in numerous vision challenges, sometimes even on par with biological vision. They rely on optimization routines that typically require intensive computational power, thus the question of embedded architectures is a very active field of research. Of particular interest is the problem of incremental learning, where the device adapts to new observations or classes. To tackle this challenging problem, we propose to combine pre-trained CNNs with binary associative memories, using product random sampling as an intermediate between the two methods. The obtained architecture requires significantly less computational power and memory usage than existing counterparts. Moreover, using various challenging vision datasets we show that the proposed architecture is able to perform one-shot learning, even using only a small portion of the dataset, while keeping very good accuracy.
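The abstract only sketches the approach at a high level. As a rough, hypothetical illustration of the general idea (not the authors' implementation; all class and parameter names below are invented for this sketch), one can combine fixed feature vectors from a pre-trained CNN with randomly sampled sub-vectors, quantize each sub-vector against a small anchor set, and record class associations as bits in a binary memory, so that learning a new example is a one-shot bit-setting operation rather than gradient descent:

```python
import numpy as np

rng = np.random.default_rng(0)

class BinaryAssociativeClassifier:
    """Toy sketch: randomly sample sub-vectors of a (pre-trained CNN)
    feature vector, quantize each against random anchors, and store
    class associations as bits in a binary memory."""

    def __init__(self, dim, n_subvectors=4, sub_dim=8, n_anchors=16, n_classes=10):
        # Random index subsets: a simple stand-in for product random sampling.
        self.subsets = [rng.choice(dim, sub_dim, replace=False)
                        for _ in range(n_subvectors)]
        # Random anchors used to quantize each sub-vector.
        self.anchors = rng.normal(size=(n_subvectors, n_anchors, sub_dim))
        # Binary memory: (subvector, anchor code) -> associated classes.
        self.memory = np.zeros((n_subvectors, n_anchors, n_classes), dtype=bool)

    def _codes(self, x):
        # Nearest-anchor index for each sampled sub-vector.
        return [int(np.argmin(np.linalg.norm(self.anchors[s] - x[idx], axis=1)))
                for s, idx in enumerate(self.subsets)]

    def learn(self, x, label):
        # One-shot update: set association bits, no iterative optimization.
        for s, code in enumerate(self._codes(x)):
            self.memory[s, code, label] = True

    def predict(self, x):
        # Majority vote over the stored sub-vector associations.
        votes = np.zeros(self.memory.shape[-1])
        for s, code in enumerate(self._codes(x)):
            votes += self.memory[s, code]
        return int(np.argmax(votes))

# Usage on a stand-in feature vector:
clf = BinaryAssociativeClassifier(dim=64)
x = rng.normal(size=64)   # would come from a pre-trained CNN in practice
clf.learn(x, label=3)
print(clf.predict(x))     # -> 3
```

Because learning only sets bits indexed by the quantized codes, adding a new class or observation costs a handful of memory writes, which is what makes this family of methods attractive for budget-restricted embedded settings.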
Main file: SIPS.pdf (240.06 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01656152 , version 1 (26-07-2018)

Identifiers

Cite

Ghouthi Boukli Hacene, Vincent Gripon, Nicolas Farrugia, Matthieu Arzel, Michel Jezequel. Budget Restricted Incremental Learning with Pre-Trained Convolutional Neural Networks and Binary Associative Memories. SIPS 2017: IEEE International Workshop on Signal Processing Systems, Oct 2017, Lorient, France. pp. 1-4, ⟨10.1109/SiPS.2017.8109978⟩. ⟨hal-01656152⟩
654 views
1565 downloads
