Preprint / working paper. Year: 2019

Polyak Steps for Adaptive Fast Gradient Method

Mathieu Barré, Alexandre d'Aspremont

Abstract

Accelerated algorithms for minimizing smooth strongly convex functions usually require knowledge of the strong convexity parameter $\mu$. When $\mu$ is unknown, current adaptive techniques are based on restart schemes. When the optimal value $f^*$ is known, these strategies recover the accelerated linear convergence bound without additional grid search. In this paper we propose a new approach that achieves the same bound without any restart, using an online estimate of the strong convexity parameter. We show that the Fast Gradient Method is robust to running with a sequence of upper bounds on $\mu$, present a good candidate for this estimate sequence, and report empirical results consistent with the theory.
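The approach summarized in the abstract relies on online upper bounds on $\mu$ computed from the known optimal value $f^*$. As a rough illustration only (not the paper's exact algorithm), the sketch below runs a fast gradient method whose momentum is set from the Polyak-type ratio $\|\nabla f(y_k)\|^2 / \big(2(f(y_k) - f^*)\big)$; for an $L$-smooth, $\mu$-strongly convex $f$ this ratio always lies in $[\mu, L]$ and is therefore a valid upper bound on $\mu$. All names, signatures, and parameter choices here are hypothetical.

```python
import numpy as np

def fgm_polyak(f, grad, x0, L, f_star, n_iter=500):
    """Fast gradient method with an online upper bound on the strong
    convexity parameter mu (illustrative sketch, not the paper's algorithm).

    The estimate ||grad f(y)||^2 / (2 (f(y) - f*)) lies between mu and L
    for an L-smooth, mu-strongly convex f, so it upper-bounds mu.
    """
    x = y = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        g = grad(y)
        gap = f(y) - f_star
        if gap <= 0:                              # optimal up to numerical error
            break
        mu_est = min(L, g @ g / (2.0 * gap))      # Polyak-type upper bound on mu
        x_next = y - g / L                        # gradient step
        q = np.sqrt(mu_est / L)
        beta = (1.0 - q) / (1.0 + q)              # momentum from the estimated mu
        y = x_next + beta * (x_next - x)          # extrapolation step
        x = x_next
    return x
```

Clipping the estimate at $L$ keeps $q \in (0, 1]$ and hence the momentum coefficient in $[0, 1)$; with the exact $\mu$ this reduces to the standard accelerated scheme for strongly convex functions.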

Dates and versions

hal-02340373, version 1 (30-10-2019)

Identifiers

Cite

Mathieu Barré, Alexandre d'Aspremont. Polyak Steps for Adaptive Fast Gradient Method. 2019. ⟨hal-02340373⟩