Journal article in SIAM Journal on Optimization, Year: 2016

The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than $1/k^2$

Abstract

The forward-backward algorithm is a powerful tool for solving optimization problems with an additively separable and smooth plus nonsmooth structure. In the convex setting, a simple but ingenious acceleration scheme developed by Nesterov improves the theoretical rate of convergence for the function values from the standard $\mathcal O(k^{-1})$ down to $\mathcal O(k^{-2})$. In this short paper, we prove that the rate of convergence of a slight variant of Nesterov's accelerated forward-backward method, which produces convergent sequences, is actually $o(k^{-2})$, rather than $\mathcal O(k^{-2})$. Our arguments rely on the connection between this algorithm and a second-order differential inclusion with vanishing damping.
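For orientation, accelerated forward-backward (proximal gradient) schemes for minimizing $\Phi = f + g$, with $f$ smooth and $g$ proper, lower semicontinuous and convex, combine an inertial (extrapolation) step with a proximal gradient step. The display below is a generic sketch of this class of iterations; the particular inertial coefficient, governed by a parameter $\alpha$, and the step size $s \le 1/L$ (with $L$ the Lipschitz constant of $\nabla f$) are illustrative assumptions and are not quoted from the paper itself:

$$y_k = x_k + \frac{k-1}{k+\alpha-1}\,(x_k - x_{k-1}), \qquad x_{k+1} = \operatorname{prox}_{s g}\!\bigl(y_k - s\,\nabla f(y_k)\bigr).$$

In variants of this type, a slightly stronger damping than in Nesterov's original choice is typically what guarantees convergence of the iterates; for such a variant, the paper establishes that the function values decay at the rate $o(k^{-2})$.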

Dates and versions

hal-02072679, version 1 (19-03-2019)

Identifiers

Cite

Hedy Attouch, Juan Peypouquet. The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than $1/k^2$. SIAM Journal on Optimization, 2016, 26 (3), pp.1824-1834. ⟨10.1137/15M1046095⟩. ⟨hal-02072679⟩
