
Sparse Regularization on Thin Grids I: the LASSO

Abstract: This article analyzes the recovery performance in the presence of noise of sparse L1 regularization, often referred to as the Lasso or Basis Pursuit. We study the behavior of the method for inverse problem regularization as the discretization step size tends to zero. We assume that the sought-after sparse sum of Diracs is recovered in the noiseless case (a condition which has been thoroughly studied in the literature) and we study which support (in particular, how many Dirac masses) the Lasso estimates when noise is added to the observation. We identify a precise non-degeneracy condition that guarantees that the recovered support is close to the initial one. More precisely, we show that in the small-noise regime, when the non-degeneracy condition holds, the method estimates twice as many spikes as in the original signal: we prove that the Lasso detects a pair of neighboring spikes around each original spike location. While this paper focuses on cases where the observations vary smoothly with the spike locations (e.g. the deconvolution problem with a smooth kernel), an interesting by-product is an abstract analysis of the support stability of discrete L1 regularization, which is of independent interest. We illustrate the usefulness of this abstract analysis by analyzing, for the first time, the support instability of compressed sensing recovery.

Contributor: Gabriel Peyré
Submitted on: Monday, October 31, 2016 - 9:27:59 AM
Last modification on: Thursday, November 5, 2020 - 12:58:02 PM
Vincent Duval, Gabriel Peyré. Sparse Regularization on Thin Grids I: the LASSO. Inverse Problems, IOP Publishing, 2017, 33 (5), ⟨10.1088/1361-6420/aa5e12⟩. ⟨hal-01135200v2⟩


