Using computer algebra to certify the global convergence of a numerical optimization process - Archive ouverte HAL
Journal article, Mathematics in Computer Science, 2007

Using computer algebra to certify the global convergence of a numerical optimization process

Abstract

The basic objective of blind signal separation is to recover a set of source signals from a set of observations that are mixtures of the sources, with no, or very limited, knowledge of the mixture structure and source signals. Many algorithms have been proposed to extract the original sources; among them, the cross-correlation and constant modulus algorithm (CC-CMA) appears to be the algorithm of choice due to its computational simplicity. An important issue for the CC-CMA algorithm is the global convergence analysis, because the cost function is neither quadratic nor convex and contains undesirable stationary points. If these undesirable points are local minima, the convergence of the algorithm may not be guaranteed and CC-CMA would fail to separate the source signals. The main result of this paper is to complete the classification of these stationary points and to prove that they are not local minima unless the mixing parameter is equal to 1. This is obtained by using the theory of discriminant varieties to determine the stationary points as functions of the parameter, and then showing that the Hessian matrix of the cost function is not positive semidefinite at these stationary points unless the mixing parameter is 1.
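The core technique described above (ruling out a stationary point as a local minimum by showing the Hessian is not positive semidefinite there) can be sketched with a computer algebra system. The following is a minimal illustration on a toy parametric cost function, not the paper's actual CC-CMA cost; the symbols `f`, `a`, and the chosen polynomial are assumptions made purely for illustration.

```python
import sympy as sp

# A minimal sketch of the Hessian test on a TOY cost function; this is
# not the CC-CMA cost from the paper, just an illustration of the idea:
# a stationary point can be a local minimum only if the Hessian there is
# positive semidefinite, so a non-PSD Hessian rules it out.
x, y = sp.symbols('x y', real=True)
a = sp.symbols('a', positive=True)  # stand-in for the mixing parameter

f = x**4 - 2*a*x**2 + y**2          # toy non-convex, parametric cost

# Stationary points: where the gradient vanishes, as functions of a.
grad = [sp.diff(f, v) for v in (x, y)]
stationary = sp.solve(grad, [x, y], dict=True)

H = sp.hessian(f, (x, y))
results = {}
for pt in stationary:
    eigs = H.subs(pt).eigenvals()   # {eigenvalue: multiplicity}
    # Positive semidefinite iff every eigenvalue is nonnegative;
    # the 'positive' assumption on a lets sympy decide each sign.
    results[(pt[x], pt[y])] = all(bool(e >= 0) for e in eigs)

for point, psd in results.items():
    print(point, "PSD Hessian" if psd else "not PSD: not a local minimum")
```

Here the stationary point at the origin has Hessian diag(-4a, 2), which is not positive semidefinite for any positive value of the parameter, so it is discarded symbolically for the whole parameter range at once; the paper's contribution is to carry out this kind of parametric analysis rigorously via discriminant varieties.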
No file deposited

Dates and versions

hal-01148502 , version 1 (04-05-2015)

Identifiers

Cite

Gu Nong, Daniel Lazard, Fabrice Rouillier, Xiang Yong. Using computer algebra to certify the global convergence of a numerical optimization process. Mathematics in Computer Science, 2007, 1 (2), pp.291-304. ⟨10.1007/s11786-007-0021-7⟩. ⟨hal-01148502⟩
38 views
0 downloads

