Regularization with the Smooth-Lasso procedure - Archive ouverte HAL
Preprint / Working paper. Year: 2008

Regularization with the Smooth-Lasso procedure

Abstract

We consider the linear regression problem. We propose the S-Lasso procedure to estimate the unknown regression parameters. This estimator enjoys a sparse representation while taking into account correlations between successive covariates (or predictors). The study covers the case where $p\gg n$, i.e. the number of covariates is much larger than the number of observations. From a theoretical point of view, for fixed $p$, we establish asymptotic normality and variable selection consistency results for our procedure. When $p\geq n$, we provide variable selection consistency results and show that the S-Lasso achieves a Sparsity Inequality, i.e., a bound in terms of the number of non-zero components of the oracle vector. The S-Lasso thus has attractive variable selection properties compared to its challengers. Furthermore, we provide an estimator of the effective degrees of freedom of the S-Lasso estimator. A simulation study shows that the S-Lasso performs better than the Lasso as far as variable selection is concerned, especially when high correlations between successive covariates exist. The procedure also appears to be a good challenger to the Elastic-Net (Zou and Hastie, 2005).
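The abstract does not state the criterion explicitly; the display below is a minimal sketch of the penalized least-squares problem usually associated with a "smooth lasso" penalty, assuming the standard combination of an $\ell_1$ penalty (for sparsity) with a squared-difference penalty on successive coefficients (to exploit correlation between neighbouring covariates). The tuning parameters $\lambda_1, \lambda_2$ and the exact normalization are illustrative, not taken from the abstract.

% Sketch of an S-Lasso-type criterion (assumed form):
% l1 term for sparsity, squared differences linking successive coefficients.
\[
  \hat{\beta}
  \;=\;
  \operatorname*{arg\,min}_{\beta \in \mathbb{R}^p}
  \;\|y - X\beta\|_2^2
  \;+\; \lambda_1 \sum_{j=1}^{p} |\beta_j|
  \;+\; \lambda_2 \sum_{j=2}^{p} \bigl(\beta_j - \beta_{j-1}\bigr)^2 .
\]

As with the Elastic-Net, a quadratic penalty of this kind can be absorbed into an augmented least-squares design, so that the problem can in principle be handled by standard Lasso algorithms; this reformulation is offered here as an assumption about the computation, not as a claim from the abstract.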
Main file: Smooth_Lasso2.pdf (367.34 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-00260816 , version 1 (05-03-2008)
hal-00260816 , version 2 (15-10-2008)

Identifiers

Cite

Mohamed Hebiri. Regularization with the Smooth-Lasso procedure. 2008. ⟨hal-00260816v2⟩
131 Views
935 Downloads
