R. F. Barber and E. J. Candès, Controlling the false discovery rate via knockoffs, The Annals of Statistics, vol.43, issue.5, pp.2055-2085, 2015.

M. Bayati and A. Montanari, The LASSO risk for Gaussian matrices, IEEE Transactions on Information Theory, vol.58, issue.4, pp.1997-2017, 2012.

D. P. Bertsekas, Nonlinear Programming, Athena Scientific, Belmont, 1999.

M. Bogdan, E. J. Candès, W. Su, and A. Weinstein, Off the beaten path: ranking variables with cross-validated lasso, 2018.

P. Bühlmann and S. van de Geer, Statistics for High-Dimensional Data: Methods, Theory and Applications, 2011.

T. Cai and A. Zhang, Sharp RIP bound for sparse signal and low-rank matrix recovery, Applied and Computational Harmonic Analysis, vol.35, issue.1, pp.74-93, 2013.

E. J. Candès, The restricted isometry property and its implications for compressed sensing, Comptes Rendus Mathematique, vol.346, issue.9, pp.589-592, 2008.

E. J. Candès, Y. Fan, L. Janson, and J. Lv, Panning for gold: Model-free knockoffs for high-dimensional controlled variable selection, Journal of the Royal Statistical Society Series B, 2016.

S. S. Chen, D. L. Donoho, and M. A. Saunders, Atomic decomposition by basis pursuit, SIAM Review, vol.43, issue.1, pp.129-159, 2001.

I. Daubechies, R. DeVore, M. Fornasier, and C. S. Güntürk, Iteratively reweighted least squares minimization for sparse recovery, Communications on Pure and Applied Mathematics, vol.63, issue.1, pp.1-38, 2010.

P. Descloux and S. Sardy, Model selection with lasso-zero: adding straw to the haystack to better find needles, 2018.

D. L. Donoho and J. Tanner, Observed universality of phase transitions in high-dimensional geometry, with implications for modern data analysis and signal processing, Philosophical Transactions of the Royal Society A, vol.367, issue.1906, pp.4273-4293, 2009.

D. L. Donoho and M. Elad, Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ1 minimization, Proceedings of the National Academy of Sciences, vol.100, issue.5, pp.2197-2202, 2003.

D. L. Donoho and J. Tanner, Precise undersampling theorems, Proceedings of the IEEE, vol.98, issue.6, pp.913-924, 2010.

C. Dossal, A necessary and sufficient condition for exact sparse recovery by ℓ1 minimization, Comptes Rendus Mathematique, vol.350, issue.1, pp.117-120, 2012.

C. Dossal, M. Chabanol, G. Peyré, and J. Fadili, Sharp support recovery from noisy random measurements by ℓ1-minimization, Applied and Computational Harmonic Analysis, vol.33, issue.1, pp.24-43, 2012.
URL : https://hal.archives-ouvertes.fr/hal-00553670

S. Foucart and H. Rauhut, A Mathematical Introduction to Compressive Sensing, Birkhäuser, 2013.

C. Giacobino, S. Sardy, J. Diaz-Rodriguez, and N. Hengartner, Quantile universal threshold, Electronic Journal of Statistics, vol.11, issue.2, pp.4701-4722, 2017.

R. Gribonval and M. Nielsen, Sparse representations in unions of bases, IEEE Transactions on Information Theory, vol.49, issue.12, pp.3320-3325, 2003.
URL : https://hal.archives-ouvertes.fr/inria-00071943

J. Huang, S. Ma, and C. Zhang, Adaptive lasso for sparse high-dimensional regression models, Statistica Sinica, vol.18, pp.1603-1618, 2008.

N. Meinshausen and P. Bühlmann, High-dimensional graphs and variable selection with the lasso, The Annals of Statistics, vol.34, issue.3, pp.1436-1462, 2006.

W. Su, M. Bogdan, and E. J. Candès, False discoveries occur early on the lasso path, The Annals of Statistics, vol.45, issue.5, pp.2133-2150, 2017.

P. Tardivel, Représentation parcimonieuse et procédures de tests multiples : application à la métabolomique (Sparse representation and multiple testing procedures: application to metabolomics), PhD thesis, 2017.

P. J. C. Tardivel, R. Servien, and D. Concordet, Sparsest representations and approximations of an underdetermined linear system, Inverse Problems, vol.34, issue.5, p.055002, 2018.

R. Tibshirani, Regression shrinkage and selection via the lasso, Journal of the Royal Statistical Society. Series B (Methodological), vol.58, issue.1, pp.267-288, 1996.

R. J. Tibshirani, The lasso problem and uniqueness, Electronic Journal of Statistics, vol.7, pp.1456-1490, 2013.

M. J. Wainwright, Sharp thresholds for high-dimensional and noisy sparsity recovery using ℓ1-constrained quadratic programming (lasso), IEEE Transactions on Information Theory, vol.55, issue.5, pp.2183-2202, 2009.

S. Wang, H. Weng, and A. Maleki, Which bridge estimator is optimal for variable selection?, 2017.

A. Weinstein, R. Barber, and E. J. Candès, A power and prediction analysis for knockoffs with lasso statistics, 2017.

P. Zhao and B. Yu, On model selection consistency of lasso, The Journal of Machine Learning Research, vol.7, pp.2541-2563, 2006.

H. Zou, The adaptive lasso and its oracle properties, Journal of the American Statistical Association, vol.101, issue.476, pp.1418-1429, 2006.