Optimal Computational Trade-Off of Inexact Proximal Methods (short version)

Abstract: In this paper, we investigate the trade-off between convergence rate and computational cost when minimizing a composite functional with proximal-gradient methods, which are popular optimization tools in machine learning. We consider the case where the proximity operator is approximated via an iterative procedure, yielding algorithms with two nested loops. We show that the strategy that minimizes the computational cost of reaching a desired accuracy in finite time is to keep the number of inner iterations constant, which differs from the strategy suggested by a convergence-rate analysis.
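To make the two-nested-loop structure concrete, here is a minimal pure-Python sketch. It is not the paper's algorithm: the quadratic data term, the 1-D total-variation regularizer, and all parameter values below are illustrative assumptions. The outer loop is a standard proximal-gradient step; the inner loop approximates the TV proximity operator (which has no closed form) with a fixed number of dual projected-gradient iterations, i.e. the constant-inner-iterations strategy the abstract describes.

```python
# Sketch of an inexact proximal-gradient method with two nested loops.
# Illustrative problem (NOT from the paper):
#   minimize_x  0.5 * sum_j a_j (x_j - b_j)^2 + lam * sum_i |x_{i+1} - x_i|

def diff(x):
    """Forward differences: (D x)_i = x_{i+1} - x_i."""
    return [x[i + 1] - x[i] for i in range(len(x) - 1)]

def diff_t(u, n):
    """Adjoint of the forward-difference operator: (D^T u)_j = u_{j-1} - u_j."""
    return [(u[j - 1] if j >= 1 else 0.0) - (u[j] if j <= n - 2 else 0.0)
            for j in range(n)]

def approx_prox_tv(v, lam, n_inner, step=0.25):
    """Approximate prox of lam*TV at v via its dual,
        min_u 0.5 * ||v - D^T u||^2  s.t.  |u_i| <= lam,
    solved with n_inner projected-gradient steps (step <= 1/||D||^2 = 1/4)."""
    n = len(v)
    u = [0.0] * (n - 1)
    for _ in range(n_inner):
        r = [v[j] - w for j, w in enumerate(diff_t(u, n))]  # residual v - D^T u
        g = diff(r)                                          # dual gradient direction
        u = [min(lam, max(-lam, u[i] + step * g[i])) for i in range(n - 1)]
    return [v[j] - w for j, w in enumerate(diff_t(u, n))]    # primal point v - D^T u

def inexact_proximal_gradient(a, b, lam, n_outer, n_inner):
    """Outer proximal-gradient loop with a CONSTANT number of inner
    iterations per outer step."""
    L = max(a)                                   # Lipschitz constant of the smooth gradient
    x = [0.0] * len(b)
    for _ in range(n_outer):
        grad = [a[j] * (x[j] - b[j]) for j in range(len(b))]
        v = [x[j] - grad[j] / L for j in range(len(b))]
        x = approx_prox_tv(v, lam, n_inner)      # inexact proximal step
    return x
```

Increasing `n_inner` tightens the prox approximation at a higher per-iteration cost; the abstract's claim is that, to reach a fixed accuracy in finite time at minimal total cost, `n_inner` should be held constant rather than grown across outer iterations.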
Document type: Conference papers

https://hal.archives-ouvertes.fr/hal-00771722
Contributor: Pierre Machart
Submitted on: Wednesday, January 9, 2013 - 11:44:18 AM

File: proxTradeoffWS.pdf (produced by the author(s))

Identifiers

  • HAL Id: hal-00771722, version 1

Citation

Pierre Machart, Luca Baldassarre, Sandrine Anthoine. Optimal Computational Trade-Off of Inexact Proximal Methods (short version). Multi-Trade-offs in Machine Learning (NIPS), Dec 2012, Lake Tahoe, United States. ⟨hal-00771722⟩
