Conference paper. Year: 2018

Averaging Stochastic Gradient Descent on Riemannian Manifolds

Abstract

We consider the minimization of a function defined on a Riemannian manifold $\mathcal{M}$ accessible only through unbiased estimates of its gradients. We develop a geometric framework to transform a sequence of slowly converging iterates generated from stochastic gradient descent (SGD) on $\mathcal{M}$ to an averaged iterate sequence with a robust and fast $O(1/n)$ convergence rate. We then present an application of our framework to geodesically-strongly-convex (and possibly Euclidean non-convex) problems. Finally, we demonstrate how these ideas apply to the case of streaming $k$-PCA, where we show how to accelerate the slow rate of the randomized power method (without requiring knowledge of the eigengap) into a robust algorithm achieving the optimal rate of convergence.
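To make the averaging scheme concrete, here is a minimal sketch (Python with NumPy) of one natural instantiation for the streaming k-PCA example with k = 1: Riemannian SGD on the unit sphere with slow steps proportional to 1/sqrt(n), followed by a streaming geodesic (Polyak-Ruppert-style) average computed with the sphere's exponential and logarithm maps. The helper names, step-size constant, and toy data are illustrative assumptions, not the paper's exact algorithm.

```python
# A minimal sketch, assuming the unit-sphere geometry of the k = 1 PCA case.
import numpy as np

def exp_map(x, v):
    """Exponential map on the unit sphere: geodesic step from x along tangent v."""
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return x
    return np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)

def log_map(x, y):
    """Inverse exponential map: tangent vector at x pointing toward y."""
    c = np.clip(x @ y, -1.0, 1.0)
    theta = np.arccos(c)
    u = y - c * x
    norm_u = np.linalg.norm(u)
    if norm_u < 1e-12:
        return np.zeros_like(x)
    return theta * u / norm_u

def averaged_streaming_pca(samples, step_c=0.5, seed=0):
    """Riemannian SGD for the top eigenvector, plus geodesic iterate averaging."""
    rng = np.random.default_rng(seed)
    d = samples.shape[1]
    x = rng.standard_normal(d)
    x /= np.linalg.norm(x)          # SGD iterate, kept on the sphere
    x_bar = x.copy()                # averaged iterate
    for n, a in enumerate(samples, start=1):
        # Riemannian gradient of the Rayleigh quotient at x for sample a:
        # project the Euclidean gradient (a.x) a onto the tangent space at x.
        g = (a @ x) * a
        g_tan = g - (g @ x) * x
        # Slow SGD step, gamma_n proportional to 1/sqrt(n), along the geodesic.
        x = exp_map(x, (step_c / np.sqrt(n)) * g_tan)
        # Streaming geodesic average: move x_bar a fraction 1/(n+1) of the way
        # toward the new iterate along the geodesic connecting them.
        x_bar = exp_map(x_bar, log_map(x_bar, x) / (n + 1))
    return x, x_bar

# Toy usage: data with a dominant direction e_1; the averaged iterate x_bar
# typically aligns with it more accurately than the last SGD iterate.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    d, n = 20, 50_000
    cov_sqrt = np.diag([2.0] + [1.0] * (d - 1))
    samples = rng.standard_normal((n, d)) @ cov_sqrt
    x_last, x_avg = averaged_streaming_pca(samples)
    print("|<x_last, e1>| =", abs(x_last[0]), " |<x_avg, e1>| =", abs(x_avg[0]))
```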

Dates and versions

hal-01957015 , version 1 (17-12-2018)


Cite

Nilesh Tripuraneni, Nicolas Flammarion, Francis Bach, Michael I. Jordan. Averaging Stochastic Gradient Descent on Riemannian Manifolds. Conference on Learning Theory (COLT), Jul 2018, Stockholm, Sweden. ⟨hal-01957015⟩