
Sparse within Sparse Gaussian Processes using Neighbor Information

Abstract: Approximations to Gaussian processes (GPs) based on inducing variables, combined with variational inference techniques, enable state-of-the-art sparse approaches that infer GPs at scale through mini-batch-based learning. In this work, we address a limitation of sparse GPs: the difficulty of handling a large number of inducing variables without imposing a special structure on the inducing inputs. In particular, we introduce a novel hierarchical prior that imposes sparsity on the set of inducing variables. We treat our model variationally, and we show experimentally that considerable computational gains over standard sparse GPs are obtained when the sparsity on the inducing variables is realized by considering the nearest inducing inputs of a random mini-batch of the data. An extensive experimental validation demonstrates the effectiveness of our approach compared to the state of the art. Our approach makes it possible to use sparse GPs with a large number of inducing points without incurring a prohibitive computational cost.
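
The mechanism alluded to in the abstract, restricting each stochastic update to the inducing inputs nearest to the current mini-batch, can be illustrated with a short sketch. The following is a minimal illustration under assumed choices (an RBF kernel, a Titsias-style collapsed posterior over the active inducing variables, and hypothetical names such as H and Za); it is not the authors' implementation, which handles the sparsity pattern through a hierarchical prior and variational inference.

import numpy as np
from scipy.spatial import cKDTree

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel between the rows of A and B (assumed kernel choice).
    sq_dist = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * sq_dist / lengthscale ** 2)

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(10_000, 1))                 # training inputs (toy data)
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(len(X))      # noisy targets

Z = np.linspace(-3.0, 3.0, 2_000)[:, None]                   # a large set of inducing inputs
tree = cKDTree(Z)                                            # neighbor index over the inducing inputs

# Draw a random mini-batch and keep only the H nearest inducing inputs of each
# batch point, so the number of active inducing variables per step stays small
# even though Z itself is large.
batch = rng.choice(len(X), size=256, replace=False)
Xb, yb = X[batch], y[batch]
H = 8                                                        # hypothetical neighbor count
_, nn = tree.query(Xb, k=H)
active = np.unique(nn)
Za = Z[active]

# Collapsed sparse-GP posterior over the active inducing variables only
# (standard optimal q(u), restricted here to the active subset for illustration).
noise_var = 0.1 ** 2
Kuu = rbf_kernel(Za, Za) + 1e-6 * np.eye(len(Za))
Kuf = rbf_kernel(Za, Xb)
Sigma = Kuu + Kuf @ Kuf.T / noise_var
m_u = Kuu @ np.linalg.solve(Sigma, Kuf @ yb) / noise_var

# Predictive mean at new inputs, again touching only the active subset.
Xs = np.linspace(-3.0, 3.0, 5)[:, None]
Ksu = rbf_kernel(Xs, Za)
print(Ksu @ np.linalg.solve(Kuu, m_u))

In a sketch of this kind, the per-step cost scales with the size of the active subset selected for the mini-batch rather than with the full number of inducing inputs, which is the source of the computational gains claimed in the abstract.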
Document type: Preprints, Working Papers, ...

https://hal.archives-ouvertes.fr/hal-03344895
Contributor: Centre de Documentation Eurecom
Submitted on: Wednesday, September 15, 2021 - 11:44:14 AM
Last modification on: Thursday, September 16, 2021 - 3:40:44 AM


Identifiers

  • HAL Id: hal-03344895, version 1
  • arXiv: 2011.05041


Citation

Gia-Lac Tran, Dimitrios Milios, Pietro Michiardi, Maurizio Filippone. Sparse within Sparse Gaussian Processes using Neighbor Information. 2020. ⟨hal-03344895⟩
