Sparse Recovery from Extreme Eigenvalues Deviation Inequalities

Abstract: This article provides a new toolbox for deriving sparse recovery guarantees, referred to as "stable and robust sparse regression" (SRSR), from deviation inequalities on extreme singular values or extreme eigenvalues obtained in Random Matrix Theory. This work is based on Restricted Isometry Constants (RICs), a pivotal notion in Compressed Sensing and High-Dimensional Statistics: these constants finely assess how well a linear operator is conditioned on the set of sparse vectors, and hence how it performs in SRSR. While constructing deterministic matrices with suitable RICs remains an open problem, one can prove that such matrices exist using random matrix models. In this paper, we show upper bounds on RICs for Gaussian and Rademacher matrices using state-of-the-art deviation estimates on their extreme eigenvalues. This allows us to derive a lower bound on the probability of achieving SRSR. One benefit of this paper is a direct and explicit derivation of upper bounds on RICs and lower bounds on SRSR from deviation inequalities on extreme eigenvalues given by Random Matrix Theory.
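To illustrate the notion the abstract builds on, here is a minimal numerical sketch (not from the paper; the matrix sizes, sparsity level, and sample count are arbitrary choices) that estimates a lower bound on the restricted isometry constant δ_s of a column-normalised Gaussian matrix by sampling random supports and recording the worst deviation of the extreme squared singular values from 1:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, s = 200, 400, 5  # rows, columns, sparsity level (toy sizes, chosen for illustration)

# Gaussian model: i.i.d. N(0, 1/m) entries, so each column has unit norm in expectation.
A = rng.standard_normal((m, n)) / np.sqrt(m)

# Monte Carlo lower bound on delta_s: for each sampled support S, the extreme
# eigenvalues of A_S^T A_S are the squared extreme singular values of A_S,
# and delta_s bounds their deviation from 1 over all supports of size s.
delta_est = 0.0
for _ in range(2000):
    S = rng.choice(n, size=s, replace=False)
    sv = np.linalg.svd(A[:, S], compute_uv=False)
    delta_est = max(delta_est, sv[0] ** 2 - 1.0, 1.0 - sv[-1] ** 2)

print(f"empirical lower bound on delta_{s}: {delta_est:.3f}")
```

Random sampling only yields a lower bound on the true δ_s (which is a maximum over all supports of size s); the paper's contribution is in the other direction, namely probabilistic upper bounds obtained from eigenvalue deviation inequalities.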

Cited literature: 45 references
Contributor: Sandrine Dallaporta
Submitted on: Monday, November 12, 2018 - 11:31:50 PM
Last modification on: Monday, April 8, 2019 - 6:18:27 PM
Long-term archiving on: Wednesday, February 13, 2019 - 4:50:14 PM

Files produced by the author(s)
Sandrine Dallaporta, Yohann de Castro. Sparse Recovery from Extreme Eigenvalues Deviation Inequalities. ESAIM: Probability and Statistics, EDP Sciences, 2019, ⟨10.1051/ps/2018024⟩. ⟨hal-01309439v4⟩