Model Consistency of Partly Smooth Regularizers - HAL open archive
Report (Research Report) Year: 2014

Model Consistency of Partly Smooth Regularizers

Abstract

This paper studies least-squares regression penalized with partly smooth convex regularizers. This class of penalty functions is very large and versatile, and makes it possible to promote solutions conforming to some notion of low complexity. Indeed, such penalties/regularizers force the corresponding solutions to belong to a low-dimensional manifold (the so-called model), which remains stable when the penalty function undergoes small perturbations. This sensitivity property is crucial to make the underlying low-complexity (manifold) model robust to small noise.

In a deterministic setting, we show that a generalized "irrepresentable condition" implies stable model selection under small noise perturbations in the observations and the design matrix, when the regularization parameter is tuned proportionally to the noise level. We also prove that this condition is almost necessary for stable model recovery. We then turn to the random setting where the design matrix and the noise are random, and the number of observations grows large. We show that under our generalized "irrepresentable condition", and a proper scaling of the regularization parameter, the regularized estimator is model consistent. In plain words, with a probability tending to one as the number of measurements tends to infinity, the regularized estimator belongs to the correct low-dimensional model manifold.

This work unifies and generalizes a large body of literature where model consistency was known to hold, for instance for the Lasso, group Lasso, total variation (fused Lasso) and nuclear/trace norm regularizers. We show that under the deterministic model selection conditions, the forward-backward proximal splitting algorithm used to solve the penalized least-squares regression problem is guaranteed to identify the model manifold after a finite number of iterations. Lastly, we detail how our results extend from the quadratic loss to an arbitrary smooth and strictly convex loss function. We illustrate the usefulness of our results on the problem of low-rank matrix recovery from random measurements using nuclear norm minimization.
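The abstract refers to the forward-backward proximal splitting scheme applied to the penalized least-squares problem, and to low-rank recovery from random measurements via nuclear norm minimization. Below is a minimal sketch of that setting, not the report's own code: it assumes the standard formulation min_W 0.5*||Phi vec(W) - y||^2 + lam*||W||_*, where the proximal operator of the nuclear norm is singular value soft-thresholding. Matrix sizes, step size and regularization parameter are purely illustrative.

# Minimal sketch (illustrative, not the authors' code) of forward-backward
# proximal splitting for nuclear-norm-penalized least squares.
import numpy as np

def prox_nuclear(W, tau):
    """Proximal operator of tau * nuclear norm: singular value soft-thresholding."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def forward_backward(Phi, y, lam, shape, n_iter=500):
    """Approximately solve min_W 0.5*||Phi vec(W) - y||^2 + lam*||W||_*."""
    m, n = shape
    W = np.zeros((m, n))
    # Step size below the classical 2/L bound, with L the Lipschitz constant
    # of the gradient of the quadratic loss (largest eigenvalue of Phi^T Phi).
    L = np.linalg.norm(Phi, 2) ** 2
    step = 1.0 / L
    for _ in range(n_iter):
        grad = (Phi.T @ (Phi @ W.ravel() - y)).reshape(m, n)  # forward (gradient) step
        W = prox_nuclear(W - step * grad, step * lam)         # backward (proximal) step
    return W

# Toy usage: recover a rank-1 matrix from random Gaussian measurements.
rng = np.random.default_rng(0)
m, n, rank, n_meas = 8, 8, 1, 40
W0 = rng.standard_normal((m, rank)) @ rng.standard_normal((rank, n))
Phi = rng.standard_normal((n_meas, m * n)) / np.sqrt(n_meas)
y = Phi @ W0.ravel() + 0.01 * rng.standard_normal(n_meas)
W_hat = forward_backward(Phi, y, lam=0.05, shape=(m, n))
print("estimated rank:", np.sum(np.linalg.svd(W_hat, compute_uv=False) > 1e-3))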
Main file
PartlySmoothSensitivity.pdf (347.05 KB)
figures/phasetrans-nuclear-meas-eps-converted-to.pdf (4.94 KB)
figures/phasetrans-nuclear-meas.pdf (4.94 KB)
figures/phasetrans-nuclear-rank-eps-converted-to.pdf (4.65 KB)
figures/phasetrans-nuclear-rank.pdf (4.65 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-00987293 , version 1 (05-05-2014)
hal-00987293 , version 2 (07-06-2014)
hal-00987293 , version 3 (29-06-2014)
hal-00987293 , version 4 (20-11-2014)

Identifiers

Cite

Samuel Vaiter, Gabriel Peyré, Jalal M. Fadili. Model Consistency of Partly Smooth Regularizers. [Research Report] CNRS. 2014. ⟨hal-00987293v4⟩