Conference papers

Data sparse nonparametric regression with epsilon-insensitive losses

Abstract: Leveraging the celebrated support vector regression (SVR) method, we propose a unifying framework for delivering regression machines in reproducing kernel Hilbert spaces (RKHSs) with data sparsity. The central point is a new definition of epsilon-insensitivity, valid for many regression losses (including quantile and expectile regression) and their multivariate extensions. We show that the dual optimization problem of empirical risk minimization with epsilon-insensitivity involves a data-sparse regularization. We also provide an analysis of the excess risk, as well as a randomized coordinate descent algorithm for solving the dual. Numerical experiments validate our approach.
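As background for the abstract above: the classical losses that the paper's unified notion of epsilon-insensitivity generalizes are standard and can be sketched numerically. The snippet below shows the textbook SVR epsilon-insensitive loss (zero inside an epsilon-tube around the target) and the quantile (pinball) loss mentioned in the abstract; the paper's actual unified epsilon-insensitive variants of these losses are not reproduced here, and the function names and default parameters are illustrative only.

```python
import numpy as np

def eps_insensitive(residual, eps=0.1):
    """Classical SVR epsilon-insensitive loss:
    max(0, |y - f(x)| - eps), i.e. zero inside the eps-tube."""
    return np.maximum(0.0, np.abs(residual) - eps)

def pinball(residual, tau=0.5):
    """Quantile (pinball) loss at level tau in (0, 1):
    penalizes under- and over-estimation asymmetrically."""
    return np.maximum(tau * residual, (tau - 1.0) * residual)

# Residuals inside the tube incur no loss; outside, loss grows linearly.
print(eps_insensitive(np.array([0.05, 0.3]), eps=0.1))
# Pinball at tau=0.9 penalizes underestimation 9x more than overestimation.
print(pinball(np.array([1.0, -1.0]), tau=0.9))
```

The flat region of the epsilon-insensitive loss is what produces data sparsity in the dual: training points whose residuals fall inside the tube receive zero dual weight, which is the effect the paper extends to a broader family of losses.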

Cited literature: 47 references

https://hal.archives-ouvertes.fr/hal-01593459
Contributor: Maxime Sangnier
Submitted on: Thursday, August 2, 2018 - 5:21:54 PM
Last modified on: Wednesday, June 24, 2020 - 4:19:04 PM
Document(s) archived on: Saturday, November 3, 2018 - 3:38:40 PM

File

acml2017.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-01593459, version 1

Citation

Maxime Sangnier, Olivier Fercoq, Florence d'Alché-Buc. Data sparse nonparametric regression with epsilon-insensitive losses. 9th Asian Conference on Machine Learning (ACML 2017), Nov 2017, Seoul, South Korea. pp. 192-207. ⟨hal-01593459⟩


Metrics

Record views: 456
File downloads: 79