Data sparse nonparametric regression with epsilon-insensitive losses

Abstract: Leveraging the celebrated support vector regression (SVR) method, we propose a unifying framework for delivering data-sparse regression machines in reproducing kernel Hilbert spaces (RKHSs). The central point is a new definition of epsilon-insensitivity, valid for many regression losses (including quantile and expectile regression) and their multivariate extensions. We show that the dual optimization problem of empirical risk minimization with epsilon-insensitivity involves a data-sparse regularization. We also provide an analysis of the excess risk as well as a randomized coordinate descent algorithm for solving the dual. Numerical experiments validate our approach.
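The abstract builds on the classical epsilon-insensitive loss from SVR, which the paper generalizes to other regression losses. As a point of reference only (this is the standard loss, not the paper's generalized definition), a minimal NumPy sketch of epsilon-insensitivity and why it induces data sparsity:

```python
import numpy as np

def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    """Classical epsilon-insensitive loss: zero inside the eps-tube,
    absolute error minus eps outside it."""
    return np.maximum(np.abs(y_true - y_pred) - eps, 0.0)

# Residuals smaller than eps contribute nothing to the empirical risk.
# This is the source of data sparsity: training points strictly inside
# the tube receive zero dual coefficients, so only points on or outside
# the tube (the support vectors) shape the fitted function.
y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.05, 2.5, 2.0])
print(eps_insensitive_loss(y_true, y_pred, eps=0.1))  # [0.  0.4 0.9]
```

Here the first point lies inside the tube (residual 0.05 < 0.1) and is ignored by the loss, while the other two contribute linearly beyond the tube boundary.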
Document type:
Conference paper
9th Asian Conference on Machine Learning (ACML 2017), Nov 2017, Seoul, South Korea. Proceedings of Machine Learning Research, vol. 77, pp. 192-207, 2017. 〈http://www.acml-conf.org/2017/〉

Cited literature: 24 references

https://hal.archives-ouvertes.fr/hal-01593459
Contributor: Maxime Sangnier
Submitted on: Thursday, August 2, 2018 - 17:21:54
Last modified on: Thursday, March 21, 2019 - 14:32:06
Document(s) archived on: Saturday, November 3, 2018 - 15:38:40

File

acml2017.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-01593459, version 1

Citation

Maxime Sangnier, Olivier Fercoq, Florence D'Alché-Buc. Data sparse nonparametric regression with epsilon-insensitive losses. 9th Asian Conference on Machine Learning (ACML 2017), Nov 2017, Seoul, South Korea. Proceedings of Machine Learning Research, vol. 77, pp. 192-207, 2017. 〈http://www.acml-conf.org/2017/〉. 〈hal-01593459〉


Metrics

Record views: 317
File downloads: 19