Variational Bayes logistic regression as regularized fusion for NIST SRE 2010

Abstract: Fusion of base classifiers is seen as a way to achieve high performance in state-of-the-art speaker verification systems. Typically, we look for base classifiers that are complementary. We might also want to reinforce a good base classifier by including others that are similar to it. In any case, the final ensemble is typically small and has to be formed based on rules of thumb. We are interested in finding the subset of classifiers that has good generalization performance. We approach the problem from the sparse learning point of view: we assume that the true, but unknown, fusion weights are sparse. As a practical solution, we regularize the weighted logistic regression loss function with the Elastic-Net constraint. Although sparse solutions can easily be obtained using the least absolute shrinkage and selection operator (LASSO), it does not take into account high correlation between classifiers. Elastic-Net, on the other hand, is a compromise between the LASSO and ridge regression constraints: while ridge regression cannot produce sparse solutions, Elastic-Net can. By using a sparseness-enforcing constraint, we are able to improve over the unregularized solution in all but the tel-tel condition.
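The fusion idea in the abstract can be illustrated with a minimal sketch: each column of a score matrix holds one base classifier's outputs, and an Elastic-Net penalized logistic regression learns sparse fusion weights, zeroing out uninformative classifiers. Note this is a plain proximal-gradient sketch of the Elastic-Net constraint, not the authors' variational Bayes solver; all names and parameter values below are illustrative assumptions, not from the paper.

```python
import numpy as np

def elastic_net_logreg(X, y, lam=0.2, alpha=0.9, lr=0.1, n_iter=2000):
    """Logistic regression with an Elastic-Net penalty, fitted by
    proximal gradient descent (illustrative sketch, not the paper's
    variational Bayes method).

    X    : (n, k) matrix of base-classifier scores (one column each)
    y    : (n,) labels in {0, 1} (target / non-target trials)
    lam  : overall regularization strength
    alpha: mixing weight; alpha=1 -> pure LASSO, alpha=0 -> pure ridge
    """
    n, k = X.shape
    w = np.zeros(k)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))          # sigmoid
        # gradient of the smooth part: log-loss + ridge (L2) term
        grad = X.T @ (p - y) / n + lam * (1.0 - alpha) * w
        w = w - lr * grad
        # soft-thresholding: proximal operator of the L1 (LASSO) term,
        # which is what drives fusion weights exactly to zero
        thr = lr * lam * alpha
        w = np.sign(w) * np.maximum(np.abs(w) - thr, 0.0)
    return w

# Toy usage: two informative "classifiers", three pure-noise ones.
rng = np.random.default_rng(0)
n = 400
y = rng.integers(0, 2, n).astype(float)
signal = 2.0 * y - 1.0
X = np.column_stack([
    signal + rng.normal(0, 1, n),   # informative classifier
    signal + rng.normal(0, 1, n),   # correlated informative classifier
    rng.normal(0, 1, n),            # noise
    rng.normal(0, 1, n),            # noise
    rng.normal(0, 1, n),            # noise
])
w = elastic_net_logreg(X, y)
```

With alpha near 1 the L1 part dominates and the noise columns receive (near-)zero weight, while the two correlated informative classifiers are both retained, which is exactly the grouping behavior that distinguishes Elastic-Net from plain LASSO.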
Document type: Conference paper
Speaker Odyssey, Jun 2012, Singapore
Contributor: Anthony Larcher
Submitted on: Tuesday, 20 November 2018 - 00:05:38
Last modified on: Thursday, 14 March 2019 - 11:46:06




  • HAL Id : hal-01927593, version 1



Ville Hautamäki, Kong Aik Lee, Anthony Larcher, Tomi Kinnunen, Haizhou Li. Variational Bayes logistic regression as regularized fusion for NIST SRE 2010. Speaker Odyssey, Jun 2012, Singapore. 〈hal-01927593〉


