
Model Consistency of Partly Smooth Regularizers

Samuel Vaiter ¹, Gabriel Peyré ¹, Jalal M. Fadili ²
² Equipe Image, Laboratoire GREYC, UMR 6072 (Groupe de Recherche en Informatique, Image, Automatique et Instrumentation de Caen)
Abstract: This paper studies least-squares regression penalized with partly smooth convex regularizers. This class of penalty functions is large and versatile, and makes it possible to promote solutions conforming to some notion of low complexity. Indeed, such penalties/regularizers force the corresponding solutions to belong to a low-dimensional manifold (the so-called model), which remains stable when the penalty function undergoes small perturbations. This sensitivity property is crucial to make the underlying low-complexity (manifold) model robust to small noise.

In a deterministic setting, we show that a generalized "irrepresentable condition" implies stable model selection under small perturbations of the observations and the design matrix, when the regularization parameter is tuned proportionally to the noise level. We also prove that this condition is almost necessary for stable model recovery.

We then turn to the random setting, where the design matrix and the noise are random and the number of observations grows large. We show that under our generalized "irrepresentable condition", and with a proper scaling of the regularization parameter, the regularized estimator is model consistent. In plain words, with probability tending to one as the number of measurements tends to infinity, the regularized estimator belongs to the correct low-dimensional model manifold.

This work unifies and generalizes a large body of literature where model consistency was known to hold, for instance for the Lasso, the group Lasso, total variation (fused Lasso) and nuclear/trace norm regularizers.

We also show that, under the deterministic model selection conditions, the forward-backward proximal splitting algorithm used to solve the penalized least-squares regression problem is guaranteed to identify the model manifold after a finite number of iterations. Lastly, we detail how our results extend from the quadratic loss to an arbitrary smooth and strictly convex loss function.
We illustrate the usefulness of our results on the problem of low-rank matrix recovery from random measurements using nuclear norm minimization.
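The forward-backward scheme and the nuclear norm illustration mentioned above can be sketched concretely. The following is a minimal NumPy sketch, not the paper's actual experiment: the helper names (`svt`, `forward_backward_nuclear`), the problem sizes and all parameter values are illustrative assumptions. The proximal operator of the nuclear norm is soft-thresholding of the singular values, and the forward-backward iteration alternates a gradient step on the quadratic loss with this proximal step.

```python
import numpy as np

def svt(Z, tau):
    # Proximal operator of tau * nuclear norm: soft-threshold singular values.
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def forward_backward_nuclear(y, A, shape, lam, step, n_iter=1000):
    """Minimize 0.5*||A x - y||^2 + lam*||mat(x)||_* by forward-backward splitting.

    A acts on the vectorized matrix x; `shape` is the matrix shape.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)            # forward (gradient) step on the smooth loss
        X = (x - step * grad).reshape(shape)
        x = svt(X, step * lam).ravel()      # backward (proximal) step on the regularizer
    return x.reshape(shape)

# Toy example (illustrative sizes): recover a rank-1 matrix from random
# Gaussian measurements, in the noiseless case with a small lambda.
rng = np.random.default_rng(0)
n1, n2, m = 8, 8, 200
X0 = np.outer(rng.standard_normal(n1), rng.standard_normal(n2))  # rank-1 target
A = rng.standard_normal((m, n1 * n2)) / np.sqrt(m)
y = A @ X0.ravel()
step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1 / Lipschitz constant of the gradient
Xhat = forward_backward_nuclear(y, A, (n1, n2), lam=1e-3, step=step, n_iter=1000)
```

With enough measurements and a small regularization parameter, `Xhat` is close to the rank-1 matrix `X0`; the model identification result says that, under the paper's conditions, the iterates land exactly on the low-rank model manifold after finitely many iterations.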
Contributor: Gabriel Peyré
Submitted on: Thursday, November 20, 2014




  • HAL Id: hal-00987293, version 4
  • arXiv: 1405.1004


Samuel Vaiter, Gabriel Peyré, Jalal M. Fadili. Model Consistency of Partly Smooth Regularizers. [Research Report] CNRS. 2014. ⟨hal-00987293v4⟩


