
Variational Bayes logistic regression as regularized fusion for NIST SRE 2010

Abstract: Fusion of base classifiers is seen as the way to achieve high performance in state-of-the-art speaker verification systems. Typically, we look for base classifiers that are complementary. We might also be interested in reinforcing good base classifiers by including others that are similar to them. In any case, the final ensemble size is typically small and has to be formed based on rules of thumb. We are interested in finding the subset of classifiers that gives good generalization performance. We approach the problem from the sparse learning point of view, assuming that the true, but unknown, fusion weights are sparse. As a practical solution, we regularize the weighted logistic regression loss function with the Elastic-Net constraint. Sparse solutions can easily be obtained using the least absolute shrinkage and selection operator (LASSO), but LASSO does not take into account high correlation between classifiers. Elastic-Net, on the other hand, is a compromise between the LASSO and ridge regression constraints: while ridge regression cannot produce sparse solutions, Elastic-Net can. By using a sparseness-enforcing constraint, we are able to improve over the unregularized solution in all but the tel-tel condition.
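The fusion scheme the abstract describes can be illustrated with a minimal sketch. This is not the authors' implementation: it uses scikit-learn's elastic-net logistic regression on synthetic base-classifier scores, and all variable names and data are illustrative. The `l1_ratio` parameter blends the LASSO (1.0) and ridge (0.0) constraints, so weights of uninformative classifiers shrink toward zero while correlated good classifiers tend to be kept together.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic scores from 6 base classifiers on 200 verification trials;
# the first two classifiers carry (correlated) signal, the rest are noise.
n_trials, n_classifiers = 200, 6
labels = rng.integers(0, 2, size=n_trials)           # 1 = target, 0 = non-target
scores = rng.normal(size=(n_trials, n_classifiers))
scores[:, 0] += 2.0 * labels                         # informative classifier
scores[:, 1] += 1.5 * labels                         # correlated informative classifier

# Elastic-Net regularized logistic regression as the fusion rule:
# the learned coefficients are the (sparse-ish) fusion weights.
fusion = LogisticRegression(penalty="elasticnet", solver="saga",
                            l1_ratio=0.5, C=1.0, max_iter=5000)
fusion.fit(scores, labels)

weights = fusion.coef_              # fusion weights, shape (1, n_classifiers)
fused = fusion.decision_function(scores)  # fused scores for evaluation
```

In practice the fused scores would be evaluated on held-out trials (e.g. the NIST SRE 2010 conditions), and `l1_ratio` and `C` would be tuned on a development set.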
Document type: Conference papers

Contributor: Anthony Larcher
Submitted on: Tuesday, November 20, 2018 - 12:05:38 AM
Last modification on: Thursday, December 19, 2019 - 1:50:04 PM




HAL Id: hal-01927593, version 1



Ville Hautamäki, Kong Aik Lee, Anthony Larcher, Tomi Kinnunen, Haizhou Li. Variational Bayes logistic regression as regularized fusion for NIST SRE 2010. Speaker Odyssey, Jun 2012, Singapore. ⟨hal-01927593⟩


