Conference papers

Tight Nonparametric Convergence Rates for Stochastic Gradient Descent under the Noiseless Linear Model

Raphaël Berthier 1, 2; Francis Bach 2, 1; Pierre Gaillard 2, 1, 3

2 SIERRA - Statistical Machine Learning and Parsimony, DI-ENS - Département d'informatique de l'École normale supérieure, CNRS - Centre National de la Recherche Scientifique, Inria de Paris
3 Thoth - Apprentissage de modèles à partir de données massives, Inria Grenoble - Rhône-Alpes, LJK - Laboratoire Jean Kuntzmann
Abstract: In the context of statistical supervised learning, the noiseless linear model assumes that there exists a deterministic linear relation $Y = \langle \theta_*, \Phi(U) \rangle$ between the random output $Y$ and the random feature vector $\Phi(U)$, a potentially non-linear transformation of the inputs $U$. We analyze the convergence of single-pass, fixed step-size stochastic gradient descent on the least-squares risk under this model. The convergence of the iterates to the optimum $\theta_*$ and the decay of the generalization error follow polynomial rates whose exponents both depend on the regularities of the optimum $\theta_*$ and of the feature vectors $\Phi(u)$. We interpret our result in the reproducing kernel Hilbert space framework. As a special case, we analyze an online algorithm for estimating a real function on the unit interval from noiseless observations of its values at randomly sampled points; the convergence rate depends on the Sobolev smoothness of the function and of a chosen kernel. Finally, we apply our analysis beyond the supervised learning setting to obtain convergence rates for the averaging process (a.k.a. gossip algorithm) on a graph, depending on its spectral dimension.
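To make the setting concrete, here is a minimal sketch (not the authors' code) of single-pass, fixed step-size stochastic gradient descent under the noiseless linear model, applied to the unit-interval example from the abstract. The feature map, truncation dimension, step size, and coefficient decay are illustrative assumptions, not quantities taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

d = 200          # truncation of the (in principle infinite-dimensional) feature space (assumed)
gamma = 0.25     # fixed step size, chosen below 1 / sup_u ||Phi(u)||^2 for stability (assumed)
n_steps = 10_000 # single pass: each sample is used exactly once

def phi(u):
    # Illustrative feature map on the unit interval: a weighted Fourier (sine) basis.
    # The 1/k weights are an assumption encoding the regularity of the features.
    k = np.arange(1, d + 1)
    return np.sqrt(2.0) * np.sin(np.pi * k * u) / k

# Assumed target coefficients theta_*; their decay controls the regularity of the optimum.
theta_star = rng.standard_normal(d) / np.arange(1, d + 1)

theta = np.zeros(d)
for _ in range(n_steps):
    u = rng.uniform()                      # random input U ~ Uniform([0, 1])
    x = phi(u)                             # feature vector Phi(U)
    y = theta_star @ x                     # noiseless observation Y = <theta_*, Phi(U)>
    theta -= gamma * (theta @ x - y) * x   # one SGD step on the least-squares loss

print("parameter error ||theta_n - theta_*|| =", np.linalg.norm(theta - theta_star))

Under the assumptions of the paper, the error of such iterates is expected to decay polynomially in the number of steps, with an exponent determined by the regularity of the optimum and of the features.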


https://hal.archives-ouvertes.fr/hal-02866755
Contributor: Raphaël Berthier
Submitted on: Monday, October 26, 2020 - 5:20:05 PM
Last modification on: Friday, October 15, 2021 - 1:41:22 PM

Files

neurips_2020.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-02866755, version 2
  • arXiv: 2006.08212

Citation

Raphaël Berthier, Francis Bach, Pierre Gaillard. Tight Nonparametric Convergence Rates for Stochastic Gradient Descent under the Noiseless Linear Model. NeurIPS 2020 - 34th Conference on Neural Information Processing Systems, Dec 2020, Vancouver, Canada. pp. 2576-2586. ⟨hal-02866755v2⟩


Metrics

Record views: 9072
File downloads: 150