Journal articles

An asymptotically optimal gradient algorithm for quadratic optimization with low computational cost

Abstract: We consider gradient algorithms for minimizing a quadratic function in R^n with large n. We propose a particular sequence of step-lengths and demonstrate that the resulting gradient algorithm has a convergence rate comparable to that of Conjugate Gradients and other methods based on Krylov spaces. When the problem is large and sparse, the proposed algorithm can be more efficient than the Conjugate Gradient algorithm in terms of computational cost, as k iterations of the proposed algorithm require the computation of only O(log k) inner products.
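The mechanism the abstract describes can be sketched with a classical stand-in: Richardson iteration whose step-lengths are the reciprocals of Chebyshev nodes on an interval containing the spectrum. This is not the authors' step-length sequence (which is given in the paper), but it illustrates the general idea of a gradient method driven by a precomputed sequence of steps, with no inner products needed inside the iteration loop.

```python
import numpy as np

# Illustrative sketch only: gradient iteration x_{k+1} = x_k - gamma_k * g_k
# on f(x) = 1/2 x'Ax - b'x, with a PRECOMPUTED step-length sequence.
# Here the steps are reciprocals of Chebyshev nodes on [m, M] (classical
# Richardson/Chebyshev iteration), NOT the sequence proposed in the paper.

rng = np.random.default_rng(0)
n, K = 50, 32
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
eigs = np.linspace(1.0, 100.0, n)      # prescribed spectrum of A
A = Q @ np.diag(eigs) @ Q.T            # SPD matrix with that spectrum
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)         # exact minimizer, for checking

m, M = eigs[0], eigs[-1]               # assumed known spectral bounds
j = np.arange(K)
# Chebyshev nodes of [m, M]; their reciprocals are the step-lengths.
nodes = 0.5 * (M + m) + 0.5 * (M - m) * np.cos((2 * j + 1) * np.pi / (2 * K))
steps = 1.0 / nodes

x = np.zeros(n)
for gamma in steps:                    # no inner products in this loop
    g = A @ x - b                      # gradient: one matrix-vector product
    x = x - gamma * g

rel_err = np.linalg.norm(x - x_star) / np.linalg.norm(x_star)
print(rel_err)                         # small: Chebyshev bound ~ 3e-3 here
```

Each iteration costs one matrix-vector product and a vector update; the contrast with Conjugate Gradients is that CG additionally computes two inner products per iteration to determine its steps adaptively.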

Cited literature: 10 references

https://hal.archives-ouvertes.fr/hal-00630982
Contributor: Luc Pronzato
Submitted on: Tuesday, October 11, 2011 - 11:15:43 AM
Last modification on: Tuesday, May 26, 2020 - 6:50:35 PM
Document(s) archived on: Thursday, January 12, 2012 - 2:25:50 AM

File: Bukina-P-Z_OPTIMIZATION-HAL.pd... (files produced by the author(s))

Citation

Anatoly Zhigljavsky, Luc Pronzato, Elena Bukina. An asymptotically optimal gradient algorithm for quadratic optimization with low computational cost. Optimization Letters, Springer Verlag, 2013, 7 (6), pp.1047-1059. ⟨10.1007/s11590-012-0491-7⟩. ⟨hal-00630982⟩

Metrics

Record views: 369
File downloads: 347