Harder, Better, Faster, Stronger Convergence Rates for Least-Squares Regression

Aymeric Dieuleveut 1, 2, * Nicolas Flammarion 1, 2 Francis Bach 1, 2
* Corresponding author
1 SIERRA - Statistical Machine Learning and Parsimony
2 DI-ENS - Département d'informatique de l'École normale supérieure, CNRS - Centre National de la Recherche Scientifique, Inria de Paris
Abstract: We consider the optimization of a quadratic objective function whose gradients are only accessible through a stochastic oracle that returns the gradient at any given point plus a zero-mean, finite-variance random error. We present the first algorithm that jointly achieves the optimal prediction error rates for least-squares regression, both in terms of forgetting of initial conditions in O(1/n²), and in terms of dependence on the noise and dimension d of the problem, as O(d/n). Our new algorithm is based on averaged accelerated regularized gradient descent, and may also be analyzed through finer assumptions on initial conditions and the Hessian matrix, leading to dimension-free quantities that may still be small while the "optimal" terms above are large. In order to characterize the tightness of these new bounds, we consider an application to non-parametric regression and use the known lower bounds on the statistical performance (without computational limits), which happen to match our bounds obtained from a single pass on the data, and thus show optimality of our algorithm in a wide variety of particular trade-offs between bias and variance.
Document type: Journal article
Journal of Machine Learning Research (JMLR), 2017, 17 (101), pp. 1-51

Cited literature: 45 references

Contributor: Nicolas Flammarion
Submitted on: Tuesday, February 23, 2016 - 21:56:47
Last modified on: Thursday, February 7, 2019 - 15:49:16
Document(s) archived on: Tuesday, May 24, 2016 - 11:07:01


Files produced by the author(s)


  • HAL Id : hal-01275431, version 2
  • ARXIV : 1602.05419


Aymeric Dieuleveut, Nicolas Flammarion, Francis Bach. Harder, Better, Faster, Stronger Convergence Rates for Least-Squares Regression. Journal of Machine Learning Research (JMLR), 2017, 17 (101), pp.1-51. 〈hal-01275431v2〉


