Conference papers

Regularization in Relevance Learning Vector Quantization Using $l_1$ Norms

Abstract: We propose in this contribution a method for $l_1$ regularization in prototype-based relevance learning vector quantization (LVQ) to obtain sparse relevance profiles. Sparse relevance profiles in hyperspectral data analysis suppress those spectral bands which are not necessary for classification. In particular, we consider the sparsity in the relevance profile enforced by LASSO optimization. The latter is obtained by a gradient learning scheme using a differentiable parametrized approximation of the $l_{1}$-norm, which has an upper error bound. We extend this regularization idea also to the matrix learning variant of LVQ as the natural generalization of relevance learning.
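The abstract does not spell out the update equations, so the following is only a minimal Python sketch of the general idea: add a LASSO penalty on the relevance weights to the LVQ cost and take gradient steps through a differentiable surrogate of the $l_1$-norm. The surrogate sqrt(lambda^2 + eps), the function names (smooth_l1, relevance_update) and the parameters (beta, lr, eps) are illustrative assumptions, not the authors' parametrization.

import numpy as np

def smooth_l1(lam, eps=1e-6):
    # Differentiable surrogate of the l1-norm: |x| ~= sqrt(x^2 + eps)
    return np.sqrt(lam ** 2 + eps)

def smooth_l1_grad(lam, eps=1e-6):
    # Gradient of the smooth l1 surrogate with respect to lam
    return lam / np.sqrt(lam ** 2 + eps)

def relevance_update(lam, data_gradient, beta=0.01, lr=0.05, eps=1e-6):
    """
    One gradient step on the relevance profile of a relevance LVQ model.

    lam           : current relevance weights (one per feature / spectral band)
    data_gradient : gradient of the unregularized LVQ cost w.r.t. lam
    beta          : LASSO regularization strength (hypothetical parameter name)
    """
    grad = data_gradient + beta * smooth_l1_grad(lam, eps)
    lam = lam - lr * grad
    lam = np.clip(lam, 0.0, None)   # keep relevances non-negative
    return lam / lam.sum()          # renormalize the relevance profile

Under this sketch, bands whose relevance is repeatedly pushed toward zero by the $l_1$ term end up with (near-)zero weight, which is the sparse relevance profile the abstract refers to; the matrix-learning variant would apply the same penalty to the entries of the relevance matrix.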
Complete list of metadata

Cited literature [14 references]

https://hal.archives-ouvertes.fr/hal-00874854
Contributor: Fabrice Rossi
Submitted on: Friday, October 18, 2013 - 4:57:40 PM
Last modification on: Sunday, January 19, 2020 - 6:38:32 PM
Long-term archiving on: Friday, April 7, 2017 - 1:22:16 PM

Files

L1_regularization_3_ESANN.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-00874854, version 1
  • arXiv: 1310.5095

Citation

Martin Riedel, Marika Kästner, Fabrice Rossi, Thomas Villmann. Regularization in Relevance Learning Vector Quantization Using $l_1$ Norms. 21st European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2013), Apr 2013, Bruges, Belgium. pp.17-22. ⟨hal-00874854⟩

Metrics

Record views: 330
File downloads: 220