M-Power Regularized Least Squares Regression - Archive ouverte HAL
Report, 2013

M-Power Regularized Least Squares Regression

Abstract

Regularization is used to find a solution that both fits the data and is sufficiently smooth, making it very effective for designing and refining learning algorithms. But the influence of its exponent remains poorly understood. In particular, it is unclear how the exponent of the reproducing kernel Hilbert space (RKHS) regularization term affects the accuracy and the efficiency of kernel-based learning algorithms. Here we consider regularized least squares regression (RLSR) with an RKHS regularization raised to the power of m, where m is a variable real exponent. We design an efficient algorithm for solving the associated minimization problem, we provide a theoretical analysis of its stability, and we compare it, in terms of computational complexity, speed of convergence, and prediction accuracy, to the classical kernel ridge regression algorithm, where the regularization exponent m is fixed at 2. Our results show that the m-power RLSR problem can be solved efficiently, and support the suggestion that one can use a regularization term that grows significantly slower than the standard quadratic growth in the RKHS norm.
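The minimization problem described above can be sketched concretely. Assuming the representer theorem applies (the penalty is strictly increasing in the RKHS norm), the minimizer has the form f = Σᵢ αᵢ k(xᵢ, ·), and the objective becomes ‖y − Kα‖² + λ(αᵀKα)^(m/2). Setting the gradient to zero yields (K + μI)α = y with an effective ridge level μ = (λm/2)(αᵀKα)^((m−2)/2), which suggests a simple fixed-point iteration. This is only an illustrative sketch under those assumptions, not the paper's algorithm; the function name, kernel choice, and toy data are invented for the example.

```python
import numpy as np

def m_power_rlsr(K, y, lam, m, n_iter=100, tol=1e-10):
    """Fixed-point sketch for min_a ||y - K a||^2 + lam * (a^T K a)^(m/2).

    Each step solves a standard ridge system with an effective
    regularization level derived from the current RKHS norm.
    Not the paper's algorithm -- an illustrative stand-in only.
    """
    n = len(y)
    alpha = np.linalg.solve(K + lam * np.eye(n), y)  # ridge warm start
    for _ in range(n_iter):
        s = float(alpha @ K @ alpha)                 # squared RKHS norm of current f
        mu = lam * (m / 2.0) * s ** ((m - 2) / 2.0)  # effective ridge level
        new_alpha = np.linalg.solve(K + mu * np.eye(n), y)
        if np.linalg.norm(new_alpha - alpha) < tol:
            alpha = new_alpha
            break
        alpha = new_alpha
    return alpha

# Toy data with a Gaussian kernel on a 1-D sample (all values assumed).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=20)
y = np.sin(3 * X) + 0.1 * rng.standard_normal(20)
K = np.exp(-(X[:, None] - X[None, :]) ** 2 / 0.5)

alpha_2 = m_power_rlsr(K, y, lam=0.1, m=2)           # m = 2: kernel ridge regression
alpha_ridge = np.linalg.solve(K + 0.1 * np.eye(20), y)
alpha_15 = m_power_rlsr(K, y, lam=0.1, m=1.5)        # slower-growing penalty
```

For m = 2 the effective level μ equals λ, so the iteration reproduces the kernel ridge solution exactly; for m < 2 the penalty grows sub-quadratically in the RKHS norm, the regime the abstract suggests can be advantageous.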
Main file
MRLSR_ARXIV.pdf (633.84 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-00871214 , version 1 (09-10-2013)
hal-00871214 , version 2 (14-12-2016)

Identifiers

Cite

Julien Audiffren, Hachem Kadri. M-Power Regularized Least Squares Regression. 2013. ⟨hal-00871214v1⟩

Collections

LARA
