C. C. Aggarwal and P. S. Yu, Outlier detection for high dimensional data, ACM SIGMOD Record, vol.30, issue.2, pp.37-46, 2001.
DOI : 10.1145/376284.375668

F. Bach, R. Jenatton, J. Mairal, and G. Obozinski, Optimization with sparsity-inducing penalties, Foundations and Trends in Machine Learning, vol.4, issue.1, pp.1-106, 2012.
URL : https://hal.archives-ouvertes.fr/hal-00613125

F. Bach, R. Jenatton, J. Mairal, and G. Obozinski, Structured Sparsity through Convex Optimization, Statistical Science, vol.27, issue.4, pp.450-468, 2012.
DOI : 10.1214/12-STS394

URL : https://hal.archives-ouvertes.fr/hal-00621245

Y. Benjamini and Y. Hochberg, Controlling the false discovery rate: a practical and powerful approach to multiple testing, Journal of the Royal Statistical Society, Series B (Methodological), vol.57, issue.1, pp.289-300, 1995.

M. Bogdan, E. van den Berg, C. Sabatti, W. Su, and E. J. Candès, SLOPE - Adaptive variable selection via convex optimization, The Annals of Applied Statistics, vol.9, issue.3, pp.1103-1140, 2015.
DOI : 10.1214/15-AOAS842

URL : http://europepmc.org/articles/pmc4689150?pdf=render

M. Bogdan, E. van den Berg, W. Su, and E. J. Candès, Statistical estimation and testing via the ordered ℓ1 norm, arXiv:1310.1969, 2013.

P. Bühlmann and S. van de Geer, Statistics for high-dimensional data: methods, theory and applications, 2011.
DOI : 10.1007/978-3-642-20192-9

P. C. Bellec, G. Lecué, and A. B. Tsybakov, Slope meets Lasso: improved oracle bounds and optimality, arXiv preprint, 2016.

E. J. Candès, J. Romberg, and T. Tao, Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information, IEEE Transactions on Information Theory, vol.52, issue.2, pp.489-509, 2006.
DOI : 10.1109/TIT.2005.862083

E. J. Candès, J. K. Romberg, and T. Tao, Stable signal recovery from incomplete and inaccurate measurements, Communications on Pure and Applied Mathematics, vol.59, issue.8, pp.1207-1223, 2006.
DOI : 10.1002/cpa.20124

V. Chandrasekaran, B. Recht, P. A. Parrilo, and A. S. Willsky, The Convex Geometry of Linear Inverse Problems, Foundations of Computational Mathematics, vol.12, issue.6, pp.805-849, 2012.

URL : http://arxiv.org/pdf/1012.0621

S. S. Chen, D. L. Donoho, and M. A. Saunders, Atomic Decomposition by Basis Pursuit, SIAM Review, vol.43, issue.1, pp.129-159, 2001.
DOI : 10.1137/S003614450037906X

URL : http://www-stat.stanford.edu/~donoho/Reports/1995/30401.pdf

S. Chrétien and S. Darses, Sparse Recovery With Unknown Variance: A LASSO-Type Approach, IEEE Transactions on Information Theory, vol.60, issue.7, pp.3970-3988, 2014.
DOI : 10.1109/TIT.2014.2301162

R. Cook and S. Weisberg, Residuals and influence in regression, 1982.

W. J. Dixon, Analysis of extreme values. The Annals of Mathematical Statistics, pp.488-506, 1950.
DOI : 10.1214/aoms/1177729747

URL : http://doi.org/10.1214/aoms/1177729747

A. Duval, S. Rolland, A. Compoint, E. Tubacher, B. Iacopetta et al., Evolution of instability at coding and non-coding repeat sequences in human MSI-H colorectal cancers, Human Molecular Genetics, vol.10, issue.5, pp.513-518, 2001.
DOI : 10.1093/hmg/10.5.513

URL : https://academic.oup.com/hmg/article-pdf/10/5/513/9464434/dde054.pdf

J. Fan and R. Li, Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties, Journal of the American Statistical Association, vol.96, issue.456, pp.1348-1360, 2001.
DOI : 10.1198/016214501753382273

URL : http://www.stat.psu.edu/~rli/research/penlike.pdf

X. Gao and Y. Fang, Penalized weighted least squares for outlier detection and robust regression, 2016.

D. Gervini and V. J. Yohai, A class of robust and fully efficient regression estimators, The Annals of Statistics, vol.30, issue.2, pp.583-616, 2002.
DOI : 10.1214/aos/1021379866

URL : http://doi.org/10.1214/aos/1021379866

C. Giraud, Introduction to High-Dimensional Statistics, 2014.

F. E. Grubbs, Procedures for Detecting Outlying Observations in Samples, Technometrics, vol.11, issue.1, pp.1-21, 1969.

URL : http://www.dtic.mil/cgi-bin/GetTRDoc?AD=AD0781499&Location=U2&doc=GetTRDoc.pdf

A. Gupta and S. Kohli, An MCDM approach towards handling outliers in web data: a case study using OWA operators, Artificial Intelligence Review, vol.25, issue.7, pp.59-82, 2016.

N. H. Nguyen and T. D. Tran, Robust Lasso With Missing and Grossly Corrupted Observations, IEEE Transactions on Information Theory, vol.59, issue.4, pp.2036-2058, 2013.
DOI : 10.1109/TIT.2012.2232347

URL : http://arxiv.org/pdf/1112.0391

A. S. Hadi, A new measure of overall potential influence in linear regression, Computational Statistics & Data Analysis, vol.14, issue.1, pp.1-27, 1992.
DOI : 10.1016/0167-9473(92)90078-T

A. S. Hadi and J. S. Simonoff, Procedures for the Identification of Multiple Outliers in Linear Models, Journal of the American Statistical Association, vol.88, issue.424, pp.1264-1272, 1993.

D. M. Hawkins, Identification of outliers, 1980.
DOI : 10.1007/978-94-015-3994-4

P. J. Huber, The 1972 Wald Lecture. Robust statistics: a review, The Annals of Mathematical Statistics, vol.43, issue.4, pp.1041-1067, 1972.
DOI : 10.1214/aoms/1177692459

URL : http://doi.org/10.1214/aoms/1177692459

P. J. Huber, Robust Statistics, Wiley Series in Probability and Mathematical Statistics, Wiley, 1981.
DOI : 10.1002/0471725250.scard

P. J. Bickel, Y. Ritov, and A. B. Tsybakov, Simultaneous analysis of Lasso and Dantzig selector, The Annals of Statistics, vol.37, issue.4, pp.1705-1732, 2009.
DOI : 10.1214/08-AOS620

URL : https://hal.archives-ouvertes.fr/hal-00401585

E. J. Candès and Y. Plan, Near-ideal model selection by ℓ1-minimization, The Annals of Statistics, vol.37, issue.5A, pp.2145-2177, 2009.

M. J. Wainwright, Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using ℓ1-Constrained Quadratic Programming (Lasso), IEEE Transactions on Information Theory, vol.55, issue.5, pp.2183-2202, 2009.
DOI : 10.1109/TIT.2009.2016018

V. Koltchinskii, Oracle inequalities in empirical risk minimization and sparse recovery problems, Saint-Flour Lectures, 2008.
DOI : 10.1007/978-3-642-22147-7

D. L. Donoho and X. Huo, Uncertainty principles and ideal atomic decomposition, IEEE Transactions on Information Theory, vol.47, issue.7, pp.2845-2862, 2001.
DOI : 10.1109/18.959265

URL : http://www-stat.stanford.edu/~donoho/Reports/1999/IADUP.pdf

B. Laurent and P. Massart, Adaptive estimation of a quadratic functional by model selection, The Annals of Statistics, vol.28, issue.5, pp.1302-1338, 2000.
DOI : 10.1214/aos/1015957395

G. Raskutti, M. J. Wainwright, and B. Yu, Restricted eigenvalue properties for correlated Gaussian designs, Journal of Machine Learning Research, vol.11, pp.2241-2259, 2010.

K. Ro, C. Zou, Z. Wang, and G. Yin, Outlier detection for high-dimensional data, Biometrika, vol.102, issue.3, pp.589-599, 2015.

P. Rousseeuw and V. Yohai, Robust regression by means of S-estimators, in Robust and Nonlinear Time Series Analysis, pp.256-272, 1984.
DOI : 10.1007/978-1-4615-7821-5_15

Y. She and A. B. Owen, Outlier Detection Using Nonconvex Penalized Regression, Journal of the American Statistical Association, vol.106, issue.494, 2011.
DOI : 10.1198/jasa.2011.tm10390

URL : http://www-stat.stanford.edu/%7Eowen/reports/theta-ipod.pdf

A. F. Siegel, Robust regression using repeated medians, Biometrika, vol.69, issue.1, pp.242-244, 1982.
DOI : 10.1093/biomet/69.1.242

W. Su, M. Bogdan, and E. J. Candès, False discoveries occur early on the Lasso path, The Annals of Statistics, vol.45, issue.5, pp.2133-2150, 2017.
DOI : 10.1214/16-AOS1521

URL : http://arxiv.org/pdf/1511.01957

W. Su and E. J. Candès, SLOPE is adaptive to unknown sparsity and asymptotically minimax, The Annals of Statistics, vol.44, issue.3, pp.1038-1068, 2016.
DOI : 10.1214/15-AOS1397

URL : http://arxiv.org/pdf/1503.08393

T. Sun and C. Zhang, Scaled sparse linear regression, Biometrika, vol.99, issue.4, pp.879-898, 2012.

N. Suraweera, B. Iacopetta, A. Duval, A. Compoint, E. Tubacher et al., Conservation of mononucleotide repeats within 3' and 5' untranslated regions and their instability in MSI-H colorectal cancer, Oncogene, vol.20, issue.51, pp.7472-7477, 2001.
DOI : 10.1038/sj.onc.1204952

URL : http://www.nature.com/onc/journal/v20/n51/pdf/1204952a.pdf

R. Tibshirani, Regression shrinkage and selection via the lasso, Journal of the Royal Statistical Society, Series B (Methodological), vol.58, issue.1, pp.267-288, 1996.

H. Wang, G. Li, and G. Jiang, Robust Regression Shrinkage and Consistent Variable Selection Through the LAD-Lasso, Journal of Business & Economic Statistics, vol.25, issue.3, pp.347-355, 2007.
DOI : 10.1198/073500106000000251

T. Wang and Z. Li, Outlier detection in high-dimensional regression model, Communications in Statistics - Theory and Methods, vol.41, issue.1, 2016.

S. Weisberg, Applied linear regression, 2005.
DOI : 10.1002/0471704091

C. Yu and W. Yao, Robust linear regression: A review and comparison, Communications in Statistics - Simulation and Computation, 2016.
DOI : 10.1080/03610918.2016.1202271

URL : http://arxiv.org/pdf/1404.6274.pdf

Y. Zhang, M. J. Wainwright, and M. I. Jordan, Lower bounds on the performance of polynomial-time algorithms for sparse linear regression, Proceedings of The 27th Conference on Learning Theory, COLT 2014, pp.921-948, 2014.

H. Zou, The Adaptive Lasso and Its Oracle Properties, Journal of the American Statistical Association, vol.101, issue.476, 2006.
DOI : 10.1198/016214506000000735

URL : http://cbio.ensmp.fr/~jvert/svn/bibli/local/Zou2006adaptive.pdf