Deep Networks with Adaptive Nyström Approximation

Abstract : Recent work has focused on combining kernel methods and deep learning to exploit the best of both approaches. Here, we introduce a new neural network architecture in which the top dense layers of standard convolutional architectures are replaced with an approximation of a kernel function based on the Nyström approximation. Our approach is simple and highly flexible: it is compatible with any kernel function and it allows exploiting multiple kernels. We show that our architecture achieves the same performance as standard architectures on datasets such as SVHN and CIFAR100. One benefit of the method lies in its limited number of learnable parameters, which makes it particularly suited to small training sets, e.g. from 5 to 20 samples per class.
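To make the abstract's core idea concrete, the following is a minimal sketch of a standard Nyström feature map with an RBF kernel: inputs are mapped to explicit features whose inner products approximate the kernel, so a linear layer on top behaves like a kernel machine. This is a generic illustration, not the authors' exact adaptive architecture (in the paper, the landmark points are learned jointly with the convolutional layers); the function names and the choice of `gamma` are assumptions for the example.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Gaussian RBF kernel matrix between the rows of X and the rows of Y."""
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Y**2, axis=1)[None, :]
                - 2.0 * X @ Y.T)
    return np.exp(-gamma * sq_dists)

def nystrom_features(X, landmarks, gamma=0.5):
    """Nystrom feature map: phi(X) = K(X, L) @ K(L, L)^{-1/2}.

    Inner products of the returned features approximate the full kernel:
    phi(X) @ phi(X).T ~= K(X, X).
    """
    K_mm = rbf_kernel(landmarks, landmarks, gamma)
    # K_mm^{-1/2} via eigendecomposition, clipping tiny eigenvalues for stability
    vals, vecs = np.linalg.eigh(K_mm)
    inv_sqrt = vecs @ np.diag(np.clip(vals, 1e-12, None) ** -0.5) @ vecs.T
    K_nm = rbf_kernel(X, landmarks, gamma)
    return K_nm @ inv_sqrt
```

In the architecture described above, the convolutional feature vector plays the role of `X`, the landmarks are trainable parameters, and the resulting features feed a small linear classifier, which is what keeps the parameter count low.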

https://hal.archives-ouvertes.fr/hal-02091661
Contributor : Luc Giffon
Submitted on : Friday, April 5, 2019 - 10:45:49 PM
Last modification on : Monday, May 27, 2019 - 1:20:13 AM
Long-term archiving on : Saturday, July 6, 2019 - 4:33:03 PM

Identifiers

  • HAL Id : hal-02091661, version 1

Citation

Luc Giffon, Stéphane Ayache, Thierry Artières, Hachem Kadri. Deep Networks with Adaptive Nyström Approximation. IJCNN 2019 - International Joint Conference on Neural Networks, Jul 2019, Budapest, Hungary. ⟨hal-02091661⟩
