Journal article in SIAM Journal on Optimization, Year: 2015

Duality between subgradient and conditional gradient methods

Abstract

Given a convex optimization problem and its dual, there are many possible first-order algorithms. In this paper, we show the equivalence between mirror descent algorithms and algorithms generalizing the conditional gradient method. This is done through convex duality, and implies notably that for certain problems, such as for supervised machine learning problems with non-smooth losses or problems regularized by non-smooth regularizers, the primal subgradient method and the dual conditional gradient method are formally equivalent. The dual interpretation leads to a form of line search for mirror descent, as well as guarantees of convergence for primal-dual certificates.
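
As a rough, self-contained illustration of the two algorithmic templates the paper relates (and not the paper's own construction), the sketch below runs a conditional gradient (Frank-Wolfe) method and a projected subgradient method on a toy least-squares problem over an l1 ball. The toy problem, the step-size choices, and the helper routines fw_lmo and project_l1 are assumptions made for this example only.

import numpy as np

# Toy problem: minimize 0.5 * ||A x - b||^2 over the l1 ball of a given radius.
# This specific setup is an illustrative assumption, not the setting of the paper.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
RADIUS = 1.0

def grad(x):
    # Gradient of the smooth loss 0.5 * ||A x - b||^2.
    return A.T @ (A @ x - b)

def fw_lmo(g, radius):
    # Linear minimization oracle over the l1 ball:
    # argmin_{||s||_1 <= radius} <g, s> is a signed, scaled coordinate vector.
    i = np.argmax(np.abs(g))
    s = np.zeros_like(g)
    s[i] = -radius * np.sign(g[i])
    return s

def project_l1(v, radius):
    # Euclidean projection onto the l1 ball (standard sort-based routine).
    if np.abs(v).sum() <= radius:
        return v
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, v.size + 1) > css - radius)[0][-1]
    theta = (css[rho] - radius) / (rho + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def conditional_gradient(steps=200):
    # Frank-Wolfe / conditional gradient: only linear oracles, no projections.
    x = np.zeros(A.shape[1])
    for k in range(steps):
        s = fw_lmo(grad(x), RADIUS)
        gamma = 2.0 / (k + 2)  # standard open-loop step size
        x = (1 - gamma) * x + gamma * s
    return x

def projected_subgradient(steps=200):
    # Projected (sub)gradient method on the same constrained problem
    # (the loss here is smooth, so the subgradient is just the gradient;
    # the decreasing step size is an illustrative choice).
    x = np.zeros(A.shape[1])
    for k in range(steps):
        x = project_l1(x - grad(x) / np.sqrt(k + 1), RADIUS)
    return x

print("conditional gradient loss:", 0.5 * np.sum((A @ conditional_gradient() - b) ** 2))
print("projected subgradient loss:", 0.5 * np.sum((A @ projected_subgradient() - b) ** 2))
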
Main file: 94196.pdf (237.41 KB)
Origin: Publisher files allowed on an open archive

Dates and versions

hal-00757696 , version 1 (27-11-2012)
hal-00757696 , version 2 (02-01-2013)
hal-00757696 , version 3 (18-10-2013)
hal-00757696 , version 4 (03-02-2015)

Identifiers

hal-00757696
DOI: 10.1137/130941961

Cite

Francis Bach. Duality between subgradient and conditional gradient methods. SIAM Journal on Optimization, 2015, 25 (1), pp.115-129. ⟨10.1137/130941961⟩. ⟨hal-00757696v4⟩
305 Views
812 Downloads

