On Kernel Derivative Approximation with Random Fourier Features - Archive ouverte HAL
Preprint / Working Paper, Year: 2018

On Kernel Derivative Approximation with Random Fourier Features

Abstract

Random Fourier features (RFF) are one of the most popular and widespread techniques in machine learning for scaling up kernel algorithms. Despite the numerous successful applications of RFFs, unfortunately, rather little is understood theoretically about their optimality and the limits of their performance. To the best of our knowledge, the only settings where precise statistical-computational trade-offs have been established are the approximation of kernel values, kernel ridge regression, and kernel principal component analysis. Our goal is to spark the investigation of the optimality of RFF-based approximations in tasks involving not only function values but also derivatives, which naturally lead to optimization problems with kernel derivatives. In particular, in this paper we focus on the approximation quality of RFFs for kernel derivatives and prove that the existing finite-sample guarantees can be improved exponentially in terms of the domain where they hold, using recent tools from unbounded empirical process theory. Our result implies that the same approximation guarantee is achievable for kernel derivatives using RFFs as for kernel values.
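To make the object of study concrete, the following is a minimal sketch (not the paper's estimator or proof technique) of how RFFs approximate both a kernel and its derivative. It assumes a Gaussian kernel k(x, y) = exp(-||x - y||² / (2σ²)), whose spectral measure is Gaussian; the feature map z(x) = sqrt(2/m) cos(ωᵀx + b) gives k(x, y) ≈ z(x)ᵀz(y), and differentiating z in x gives an approximation of ∂k/∂x. All names and parameters below are illustrative.

```python
import numpy as np

def rff_features(X, omega, b):
    """z(x) = sqrt(2/m) * cos(omega^T x + b), so that z(x)^T z(y) ~ k(x, y)."""
    m = omega.shape[1]
    return np.sqrt(2.0 / m) * np.cos(X @ omega + b)

def rff_grad_features(X, omega, b):
    """Gradient of the feature map w.r.t. x; shape (n, d, m)."""
    m = omega.shape[1]
    S = -np.sqrt(2.0 / m) * np.sin(X @ omega + b)      # (n, m)
    return S[:, None, :] * omega[None, :, :]           # chain rule: d cos(w^T x + b)/dx = -w sin(.)

rng = np.random.default_rng(0)
d, m, sigma = 3, 2000, 1.0
omega = rng.normal(scale=1.0 / sigma, size=(d, m))     # spectral measure of the Gaussian kernel
b = rng.uniform(0.0, 2.0 * np.pi, size=m)

x = rng.normal(size=(1, d))
y = rng.normal(size=(1, d))

# Exact Gaussian kernel value and its gradient in x.
diff = x - y
k_true = np.exp(-np.sum(diff**2) / (2 * sigma**2))
grad_true = (-diff / sigma**2 * k_true).ravel()

# RFF approximations of the kernel value and of the derivative d k(x, y) / d x.
zx, zy = rff_features(x, omega, b), rff_features(y, omega, b)
k_rff = float(zx @ zy.T)
grad_rff = rff_grad_features(x, omega, b)[0] @ zy[0]   # (d,)

print(k_true, k_rff)
print(grad_true, grad_rff)
```

With m in the thousands, both the kernel value and its gradient are typically recovered to a few decimal places; the paper's contribution concerns how such finite-sample guarantees scale with the diameter of the domain on which they are required to hold uniformly.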
Main file: RFFD.pdf (434.79 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01897996 , version 1 (17-10-2018)
hal-01897996 , version 2 (02-11-2018)
hal-01897996 , version 3 (09-02-2019)

Identifiers

  • HAL Id : hal-01897996 , version 1

Cite

Zoltán Szabó, Bharath K. Sriperumbudur. On Kernel Derivative Approximation with Random Fourier Features. 2018. ⟨hal-01897996v1⟩
194 Views
47 Downloads
