Algorithms for the optimal identification of segment neighborhoods, Bulletin of Mathematical Biology, vol.51, issue.1, pp.39-54, 1989.
DOI : 10.1007/978-1-4612-6137-7
Convex analysis and monotone operator theory in Hilbert spaces, 2011.
DOI : 10.1007/978-3-319-48311-5
URL : https://hal.archives-ouvertes.fr/hal-00643354
A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems, SIAM Journal on Imaging Sciences, vol.2, issue.1, pp.183-202, 2009.
DOI : 10.1137/080716542
URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.231.3271
Slope meets lasso: improved oracle bounds and optimality. ArXiv preprint, 2016.
Square-root lasso: pivotal recovery of sparse signals via conic programming, Biometrika, vol.98, issue.4, pp.791-806, 2011.
DOI : 10.1093/biomet/asr043
URL : http://arxiv.org/abs/1009.5689
Active set algorithms for isotonic regression: a unifying framework, Mathematical Programming, vol.47, issue.1-3, pp.425-439, 1990.
DOI : 10.1007/BF01580873
Matrix analysis, volume 169 of Graduate Texts in Mathematics, 1997.
Simultaneous analysis of Lasso and Dantzig selector, The Annals of Statistics, vol.37, issue.4, pp.1705-1732, 2009.
DOI : 10.1214/08-AOS620
URL : https://hal.archives-ouvertes.fr/hal-00401585
OSMnx: New Methods for Acquiring, Constructing, Analyzing, and Visualizing Complex Street Networks. ArXiv e-prints arXiv:1611.01890, 2016.
DOI : 10.2139/ssrn.2865501
URL : http://arxiv.org/abs/1611.01890
SLOPE – Adaptive variable selection via convex optimization, The Annals of Applied Statistics, vol.9, issue.3, p.1103, 2015.
DOI : 10.1214/15-AOAS842SUPP
URL : http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4689150
Concentration inequalities: A nonasymptotic theory of independence, 2013.
DOI : 10.1093/acprof:oso/9780199535255.001.0001
URL : https://hal.archives-ouvertes.fr/hal-00794821
Proximal Splitting Methods in Signal Processing, Springer Optim. Appl., vol.49, pp.185-212, 2011.
DOI : 10.1007/978-1-4419-9569-8_10
URL : https://hal.archives-ouvertes.fr/hal-00643807
On the prediction performance of the Lasso, Bernoulli, vol.23, issue.1, pp.552-581, 2017.
DOI : 10.3150/15-BEJ756
CLEAR: Covariant LEAst-Square Refitting with Applications to Image Restoration, SIAM Journal on Imaging Sciences, vol.10, issue.1, pp.243-284, 2017.
DOI : 10.1137/16M1080318
URL : https://hal.archives-ouvertes.fr/hal-01534202
Analysis versus synthesis in signal priors, Inverse Problems, vol.23, issue.3, pp.947-968, 2007.
DOI : 10.1088/0266-5611/23/3/007
URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.183.1395
ℓ0-estimation of piecewise-constant signals on graphs. ArXiv e-prints arXiv:1703.01421, 2017.
Optimal rates for total variation denoising. ArXiv e-prints arXiv:1603.09388, 2016.
Locally adaptive regression splines, The Annals of Statistics, vol.25, issue.1, pp.387-413, 1997.
DOI : 10.1214/aos/1034276635
URL : http://archiv.ub.uni-heidelberg.de/volltextserver/21350/1/beitrag.10.pdf
Efficient smoothed concomitant lasso estimation for high dimensional regression, NCMIP, 2017.
URL : https://hal.archives-ouvertes.fr/hal-01404966
On spectral clustering: Analysis and an algorithm, NIPS, pp.849-856, 2001.
A robust hybrid of lasso and ridge regression, Contemporary Mathematics, vol.443, pp.59-72, 2007.
DOI : 10.1090/conm/443/08555
Scikit-learn: Machine learning in Python, J. Mach. Learn. Res., vol.12, pp.2825-2830, 2011.
URL : https://hal.archives-ouvertes.fr/hal-00650905
Nonlinear total variation based noise removal algorithms, Physica D: Nonlinear Phenomena, vol.60, issue.1-4, pp.259-268, 1992.
DOI : 10.1016/0167-2789(92)90242-F
Total variation classes beyond 1d: Minimax rates, and the limitations of linear smoothers, NIPS, pp.3513-3521, 2016.
Sparsistency of the edge lasso over graphs, AISTATS, pp.1028-1036, 2012.
Normalized cuts and image segmentation, IEEE Trans. Pattern Anal. Mach. Intell., vol.22, issue.8, pp.888-905, 2000.
SLOPE is adaptive to unknown sparsity and asymptotically minimax, The Annals of Statistics, vol.44, issue.3, pp.1038-1068, 2016.
DOI : 10.1214/15-AOS1397SUPP
URL : http://arxiv.org/abs/1503.08393
Scaled sparse linear regression, Biometrika, vol.99, issue.4, pp.879-898, 2012.
DOI : 10.1093/biomet/ass043
URL : http://arxiv.org/abs/1104.4595
Sparsity and smoothness via the fused lasso, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.67, issue.1, pp.91-108, 2005.
URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.160.4573
On the conditions used to prove oracle results for the Lasso, Electronic Journal of Statistics, vol.3, pp.1360-1392, 2009.
DOI : 10.1214/09-EJS506
On the robustness of the generalized fused lasso to prior specifications, Statistics and Computing, vol.26, pp.285-301, 2016.
URL : https://hal.archives-ouvertes.fr/hal-01067903
A tutorial on spectral clustering, Statistics and Computing, vol.17, issue.4, pp.395-416, 2007.
Networks, Dynamics, and the Small-World Phenomenon, American Journal of Sociology, vol.105, issue.2, pp.493-527, 1999.
DOI : 10.1086/210318
URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.78.4413
Rate minimaxity of the lasso and Dantzig selector for the ℓq loss in ℓr balls, J. Mach. Learn. Res., vol.11, pp.3519-3540, 2010.
The Ordered Weighted ℓ1 Norm: Atomic Formulation, Projections, and Algorithms. ArXiv e-prints arXiv:1409, 2014.