Conference papers

Deep Networks with Adaptive Nyström Approximation

Abstract: Recent work has focused on combining kernel methods and deep learning to exploit the best of both approaches. Here, we introduce a new neural network architecture in which the top dense layers of standard convolutional architectures are replaced with an approximation of a kernel function based on the Nyström approximation. Our approach is simple and highly flexible: it is compatible with any kernel function and it allows exploiting multiple kernels. We show that our architecture achieves the same performance as standard architectures on datasets such as SVHN and CIFAR100. One benefit of the method lies in its limited number of learnable parameters, which makes it particularly suited to small training sets, e.g. from 5 to 20 samples per class.
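The layer described in the abstract builds on the classical Nyström feature map: given a set of landmark points Z, an input x is mapped to phi(x) = k(x, Z) K(Z,Z)^{-1/2}, so that phi(x) phi(y)^T approximates the kernel k(x, y). The minimal NumPy sketch below illustrates this standard construction with an RBF kernel; the function names, the gamma value, and the random landmark selection are illustrative assumptions, not the paper's code (the paper's adaptive variant additionally learns the landmarks within the network).

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.1):
    # Gaussian (RBF) kernel matrix between the rows of X and the rows of Y.
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def nystrom_features(X, landmarks, gamma=0.1, eps=1e-8):
    # Nystrom feature map: phi(X) = k(X, Z) K(Z, Z)^{-1/2},
    # computed via an eigendecomposition of the landmark kernel matrix.
    Kzz = rbf_kernel(landmarks, landmarks, gamma)
    vals, vecs = np.linalg.eigh(Kzz)
    inv_sqrt = vecs @ np.diag(1.0 / np.sqrt(np.maximum(vals, eps))) @ vecs.T
    return rbf_kernel(X, landmarks, gamma) @ inv_sqrt

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 16))                  # stand-in for conv-layer outputs
Z = X[rng.choice(100, size=10, replace=False)]  # 10 landmark points
phi = nystrom_features(X, Z)                    # shape (100, 10)
K_approx = phi @ phi.T                          # rank-10 approximation of k(X, X)
```

Because the feature dimension equals the number of landmarks, a downstream linear classifier on phi has very few parameters, which matches the abstract's point about small training sets.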

Cited literature: 32 references
Contributor: Luc Giffon
Submitted on: Wednesday, November 27, 2019 - 4:52:40 PM
Last modification on: Thursday, January 23, 2020 - 6:22:13 PM
Long-term archiving on: Friday, February 28, 2020 - 1:57:06 PM


  • HAL Id: hal-02091661, version 2
  • arXiv: 1911.13036



Luc Giffon, Stéphane Ayache, Thierry Artières, Hachem Kadri. Deep Networks with Adaptive Nyström Approximation. IJCNN 2019 - International Joint Conference on Neural Networks, Jul 2019, Budapest, Hungary. ⟨hal-02091661v2⟩


