Abstract: In this paper, we are concerned with regularized regression problems where the prior regularizer is a proper, lower semicontinuous, and convex function which is also partly smooth relative to a Riemannian submanifold. This encompasses as special cases several known penalties such as the Lasso ($\ell^1$-norm), the group Lasso ($\ell^1-\ell^2$-norm), the $\ell^\infty$-norm, and the nuclear norm. It also includes so-called analysis-type priors, i.e., compositions of the aforementioned penalties with linear operators, typical examples being the total variation and fused Lasso penalties.
We study the sensitivity of any regularized minimizer to perturbations of the observations and provide its precise local parameterization.
Our main sensitivity analysis result shows that the predictor moves stably along the same active submanifold as the observations undergo small perturbations. This local stability is a consequence of the smoothness of the regularizer when restricted to the active submanifold, which in turn plays a pivotal role in deriving a closed-form expression for the variations of the predictor with respect to the observations, and hence for its divergence. We also show that, for a variety of regularizers, including polyhedral ones as well as the group Lasso and its analysis counterpart, this divergence formula holds Lebesgue almost everywhere.
When the perturbation is random (with an appropriate continuous distribution), this allows us to derive an unbiased estimator of the degrees of freedom and of the prediction risk of the estimator.
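As a concrete illustration outside the paper's general setting: for the classical Lasso, the unbiased degrees-of-freedom estimator specializes to the number of nonzero coefficients of a solution (Zou, Hastie and Tibshirani, 2007). The following is a minimal numerical sketch of that special case, assuming a generic proximal-gradient (ISTA) solver chosen for simplicity; the solver, data, and regularization level are illustrative, not the paper's.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the l1-norm.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=5000):
    # Solve min_beta 0.5*||X beta - y||^2 + lam*||beta||_1 by ISTA.
    L = np.linalg.norm(X, 2) ** 2  # Lipschitz constant of the gradient
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)
        beta = soft_threshold(beta - grad / L, lam / L)
    return beta

# Synthetic sparse regression problem (illustrative data).
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
beta_true = np.zeros(10)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.1 * rng.standard_normal(50)

beta_hat = lasso_ista(X, y, lam=5.0)
# Degrees-of-freedom estimate for the Lasso: the size of the active support,
# i.e. the divergence of the prediction with respect to the observations.
df_hat = int(np.count_nonzero(np.abs(beta_hat) > 1e-8))
```

Here `df_hat` is the divergence-based estimate; the paper's results extend this type of formula to general partly smooth regularizers and rank-deficient designs.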
Our results hold true without requiring the design matrix to be full column rank.
They generalize results already known in the literature, such as those for the Lasso problem, the general Lasso problem (analysis $\ell^1$-penalty), and the group Lasso, where existing results for the latter assume a full column rank design.