Deep Networks with Adaptive Nyström Approximation

Abstract : Recent work has focused on combining kernel methods and deep learning to exploit the best of the two approaches. Here, we introduce a new neural network architecture in which we replace the top dense layers of standard convolutional architectures with an approximation of a kernel function, relying on the Nyström approximation. Our approach is simple and highly flexible: it is compatible with any kernel function and it allows exploiting multiple kernels. We show that our architecture achieves the same performance as standard architectures on datasets such as SVHN and CIFAR100. One benefit of the method lies in its limited number of learnable parameters, which makes it particularly suited to small training set sizes, e.g. from 5 to 20 samples per class.
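The abstract refers to the Nyström approximation of a kernel function. As a point of reference (not the paper's implementation), the standard Nyström feature map picks m landmark points Z and maps an input x to k(x, Z) K_mm^{-1/2}, where K_mm is the kernel matrix among the landmarks; inner products of these features approximate the full kernel. A minimal NumPy sketch with an RBF kernel (function names and parameters here are illustrative assumptions):

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel on pairwise squared Euclidean distances
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_features(X, landmarks, gamma=1.0, eps=1e-8):
    # K_mm: kernel among landmarks; K_nm: data vs. landmarks
    K_mm = rbf_kernel(landmarks, landmarks, gamma)
    K_nm = rbf_kernel(X, landmarks, gamma)
    # Symmetric inverse square root of K_mm via eigendecomposition,
    # with eigenvalues floored at eps for numerical stability
    w, V = np.linalg.eigh(K_mm)
    K_mm_inv_sqrt = V @ np.diag(np.maximum(w, eps) ** -0.5) @ V.T
    # phi(x) = k(x, Z) K_mm^{-1/2}; then phi(X) phi(X)^T ~ K(X, X)
    return K_nm @ K_mm_inv_sqrt

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 16))   # toy data standing in for CNN features
Z = X[:10]                       # landmarks subsampled from the data
phi = nystrom_features(X, Z)     # shape (100, 10)
```

In the architecture described by the abstract, such a feature map would sit on top of the convolutional feature extractor in place of the dense layers, with the landmarks (and kernel parameters) adapted during training; the sketch above only illustrates the underlying approximation.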

Cited literature : 32 references

https://hal.archives-ouvertes.fr/hal-02091661
Contributor : Luc Giffon
Submitted on : Wednesday, November 27, 2019 - 4:52:40 PM
Last modification on : Sunday, December 1, 2019 - 1:26:55 AM

Identifiers

  • HAL Id : hal-02091661, version 2
  • ARXIV : 1911.13036

Citation

Luc Giffon, Stéphane Ayache, Thierry Artières, Hachem Kadri. Deep Networks with Adaptive Nyström Approximation. IJCNN 2019 - International Joint Conference on Neural Networks, Jul 2019, Budapest, Hungary. ⟨hal-02091661v2⟩
