M-Power Regularized Least Squares Regression

Abstract: Regularization yields solutions that both fit the data and are sufficiently smooth, making it a central tool for designing and refining learning algorithms. Yet the influence of the regularization exponent remains poorly understood. In particular, it is unclear how the exponent of the reproducing kernel Hilbert space (RKHS) regularization term affects the accuracy and the efficiency of kernel-based learning algorithms. Here we consider regularized least squares regression (RLSR) with an RKHS regularization raised to the power of m, where m is a variable real exponent. We design an efficient algorithm for solving the associated minimization problem, provide a theoretical analysis of its stability, and compare it, with respect to computational complexity, speed of convergence, and prediction accuracy, to the classical kernel ridge regression algorithm, in which the regularization exponent m is fixed at 2. Our results show that the m-power RLSR problem can be solved efficiently, and they support the suggestion that one can use a regularization term that grows significantly more slowly than the standard quadratic growth in the RKHS norm.
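To make the setting concrete, here is a minimal sketch (not the authors' algorithm) of the m-power RLSR objective and one plausible way to minimize it. By the representer theorem, the minimizer of (1/n)‖y − Kα‖² + λ(αᵀKα)^(m/2) can be sought over coefficient vectors α; setting the gradient to zero shows that a stationary α solves a kernel ridge system whose effective regularization depends on the current RKHS norm, which suggests a simple fixed-point iteration. The function name, step count, and stopping rule below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def m_power_rlsr(K, y, lam=0.1, m=1.5, n_iter=50, tol=1e-10):
    """Fixed-point sketch for min_a (1/n)||y - K a||^2 + lam * (a^T K a)^(m/2).

    At a stationary point, a satisfies a kernel ridge system with an
    effective regularization that depends on ||f||_H^2 = a^T K a:
        lam_eff = lam * (m/2) * (a^T K a)^((m-2)/2)
        a       = (K + n * lam_eff * I)^{-1} y
    We iterate this update until the coefficients stabilize.
    (Illustrative assumption: this simple iteration converges for the
    regime of interest; the paper's own solver may differ.)
    """
    n = len(y)
    I = np.eye(n)
    # Warm start from ordinary kernel ridge regression (the m = 2 case).
    alpha = np.linalg.solve(K + n * lam * I, y)
    for _ in range(n_iter):
        norm_sq = max(alpha @ K @ alpha, 1e-12)  # ||f||_H^2, guarded away from 0
        lam_eff = lam * (m / 2.0) * norm_sq ** ((m - 2.0) / 2.0)
        alpha_new = np.linalg.solve(K + n * lam_eff * I, y)
        if np.linalg.norm(alpha_new - alpha) < tol:
            return alpha_new
        alpha = alpha_new
    return alpha
```

Note that for m = 2 the effective regularization reduces to λ itself, so the iteration coincides with classical kernel ridge regression after the first solve; for m < 2 the penalty grows more slowly in the RKHS norm, which is exactly the regime the abstract highlights.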

Contributor: Julien Audiffren
Submitted on: Wednesday, December 14, 2016 - 10:56:56 AM
Last modification on: Monday, March 4, 2019 - 2:04:23 PM




  • HAL Id: hal-00871214, version 2
  • arXiv: 1310.2451



Julien Audiffren, Hachem Kadri. M-Power Regularized Least Squares Regression. International Joint Conference on Neural Networks (IJCNN), May 2017, Anchorage, Alaska, United States. ⟨hal-00871214v2⟩


