Optimal experimental design and quadratic optimization
Abstract
A well-known gradient-type algorithm for solving quadratic optimization problems is the method of Steepest Descent. Here the Steepest Descent algorithm is generalized to a broader family of gradient algorithms, where the step-length is chosen in accordance with a particular procedure. The asymptotic rate of convergence of this family is studied. To facilitate the investigation, we re-write the algorithms in a normalized form which enables us to exploit a link with the theory of optimum experimental design.
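The classical Steepest Descent iteration that the paper generalizes can be sketched as follows: for the quadratic f(x) = ½xᵀAx − bᵀx with A symmetric positive definite, the exact line-search (Cauchy) step-length is αₖ = gₖᵀgₖ / gₖᵀAgₖ. This is a minimal illustrative sketch of that baseline method, not the generalized family introduced in the paper; the matrix and vector values are made up for the example.

```python
import numpy as np

def steepest_descent(A, b, x0, n_iter=100):
    """Steepest Descent with exact line search on f(x) = 0.5 x'Ax - b'x,
    A symmetric positive definite. Returns the final iterate."""
    x = x0.astype(float)
    for _ in range(n_iter):
        g = A @ x - b              # gradient of the quadratic
        denom = g @ (A @ g)
        if denom == 0.0:           # zero gradient: already at the minimizer
            break
        alpha = (g @ g) / denom    # exact (Cauchy) step-length
        x = x - alpha * g
    return x

# Illustrative SPD system (values are not from the paper).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = steepest_descent(A, b, np.zeros(2))
# x approximately solves A x = b, i.e. minimizes the quadratic
```

The generalized family studied in the paper replaces the Cauchy step-length `alpha` above with other step-length procedures, whose asymptotic convergence rates are then analyzed via the link with optimum experimental design.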
Origin: Explicit agreement for this deposit