Kernel interpolation with continuous volume sampling - Archive ouverte HAL
Conference paper - Year: 2020

Kernel interpolation with continuous volume sampling

Abstract

A fundamental task in kernel methods is to pick nodes and weights, so as to approximate a given function from an RKHS by the weighted sum of kernel translates located at the nodes. This is the crux of kernel density estimation, kernel quadrature, or interpolation from discrete samples. Furthermore, RKHSs offer a convenient mathematical and computational framework. We introduce and analyse continuous volume sampling (VS), the continuous counterpart -- for choosing node locations -- of a discrete distribution introduced in (Deshpande & Vempala, 2006). Our contribution is theoretical: we prove almost optimal bounds for interpolation and quadrature under VS. While similar bounds already exist for some specific RKHSs using ad-hoc node constructions, VS offers bounds that apply to any Mercer kernel and depend on the spectrum of the associated integration operator. We emphasize that, unlike previous randomized approaches that rely on regularized leverage scores or determinantal point processes, evaluating the pdf of VS only requires pointwise evaluations of the kernel. VS is thus naturally amenable to MCMC samplers.
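The abstract points out that the VS density needs only pointwise kernel evaluations and is therefore amenable to MCMC. As a purely illustrative sketch, not the authors' implementation, the snippet below assumes the usual volume-sampling form (an unnormalized density proportional to the determinant of the kernel Gram matrix at the nodes), a Gaussian kernel, the uniform base measure on [0, 1], and a plain random-walk Metropolis sampler; it then builds the kernel interpolant at the sampled nodes.

import numpy as np

rng = np.random.default_rng(0)

# Assumed setup (illustrative only): Gaussian kernel on [0, 1] with the
# uniform distribution as base measure, and n = 5 nodes.
def k(x, y, lengthscale=0.2):
    return np.exp(-0.5 * ((x - y) / lengthscale) ** 2)

def gram(nodes):
    return k(nodes[:, None], nodes[None, :])

def log_vs_density(nodes):
    # Unnormalized log-density of (continuous) volume sampling, assumed here
    # to be log det of the kernel Gram matrix at the nodes. Only pointwise
    # kernel evaluations are needed, which is what makes MCMC practical.
    sign, logdet = np.linalg.slogdet(gram(nodes))
    return logdet if sign > 0 else -np.inf

def sample_vs(n=5, n_steps=5000, step=0.05):
    # Plain random-walk Metropolis over node configurations in [0, 1]^n;
    # a simplistic stand-in for the MCMC samplers alluded to in the abstract.
    nodes = rng.uniform(0.0, 1.0, size=n)
    logp = log_vs_density(nodes)
    for _ in range(n_steps):
        prop = nodes + step * rng.normal(size=n)
        if np.any(prop < 0.0) or np.any(prop > 1.0):
            continue  # reject proposals outside the support of the base measure
        logp_prop = log_vs_density(prop)
        if np.log(rng.uniform()) < logp_prop - logp:
            nodes, logp = prop, logp_prop
    return nodes

def interpolate(f, nodes):
    # Kernel interpolation: weights w solve K w = f(nodes), so that
    # x -> sum_i w_i k(x, nodes_i) matches f at the nodes.
    w = np.linalg.solve(gram(nodes), f(nodes))
    return lambda x: k(x[:, None], nodes[None, :]) @ w

f = np.sin  # toy target; any function evaluable at the nodes would do
nodes = sample_vs()
f_hat = interpolate(f, nodes)
x_test = np.linspace(0.0, 1.0, 200)
print("max abs error on [0, 1]:", np.max(np.abs(f_hat(x_test) - f(x_test))))

The determinant grows when the nodes are well spread out with respect to the kernel, so the sampler naturally favors diverse node configurations, which is the intuition behind the approximation guarantees discussed above.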

Dates and versions

hal-02697468, version 1 (01-06-2020)

Identifiers

Cite

Ayoub Belhadji, Rémi Bardenet, Pierre Chainais. Kernel interpolation with continuous volume sampling. ICML 2020 - International Conference on Machine Learning, 2020, Vienna, Austria. ⟨hal-02697468⟩