A. Berlinet and C. Thomas-Agnan, Reproducing Kernel Hilbert Spaces in Probability and Statistics, 2011.
DOI : 10.1007/978-1-4419-9096-9

C. Boutsidis, P. Drineas, and M. Magdon-Ismail, Near-Optimal Column-Based Matrix Reconstruction, SIAM Journal on Computing, vol.43, issue.2, pp.687-717, 2014.
DOI : 10.1137/12086755X

URL : http://arxiv.org/abs/1103.0995

S. B. Damelin, A walk through energy, discrepancy, numerical integration and group invariant measures on measurable subsets of Euclidean space, Numerical Algorithms, vol.48, pp.213-235, 2008.
DOI : 10.1007/s11075-008-9187-6

P. Drineas and M. W. Mahoney, On the Nyström method for approximating a Gram matrix for improved kernel-based learning, Journal of Machine Learning Research, vol.6, pp.2153-2175, 2005.

B. Gauthier and L. Pronzato, Spectral Approximation of the IMSE Criterion for Optimal Designs in Kernel-Based Interpolation Models, SIAM/ASA Journal on Uncertainty Quantification, vol.2, issue.1, pp.805-825, 2014.
DOI : 10.1137/130928534

URL : https://hal.archives-ouvertes.fr/hal-00913466

B. Gauthier and L. Pronzato, Convex relaxation for IMSE optimal design in random-field models, Computational Statistics & Data Analysis, 2016.
DOI : 10.1016/j.csda.2016.10.018

URL : https://hal.archives-ouvertes.fr/hal-01246483

A. Gittens and M. W. Mahoney, Revisiting the Nyström method for improved large-scale machine learning, Journal of Machine Learning Research, vol.17, pp.1-65, 2016.

J. Guélat and P. Marcotte, Some comments on Wolfe's 'away step', Mathematical Programming, vol.35, pp.110-119, 1986.
DOI : 10.1007/BF01589445

W. Hackbusch, Integral Equations: Theory and Numerical Treatment, Birkhäuser, vol.120, 1995.
DOI : 10.1007/978-3-0348-9215-5

T. Hastie, S. Rosset, R. Tibshirani, and J. Zhu, The entire regularization path for the Support Vector Machine, Journal of Machine Learning Research, vol.5, pp.1391-1415, 2004.

T. Hastie, R. Tibshirani, and M. Wainwright, Statistical Learning with Sparsity: the Lasso and Generalizations, 2015.

S. Kumar, M. Mohri, and A. Talwalkar, Sampling methods for the Nyström method, Journal of Machine Learning Research, vol.13, pp.981-1006, 2012.

H. Niederreiter, Random Number Generation and Quasi-Monte Carlo Methods, 1992.
DOI : 10.1137/1.9781611970081

M. R. Osborne, B. Presnell, and B. A. Turlach, A new approach to variable selection in least squares problems, IMA Journal of Numerical Analysis, vol.20, issue.3, pp.389-403, 2000.

M. Y. Park and T. Hastie, L1-regularization path algorithm for generalized linear models, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.69, issue.4, pp.659-677, 2007.

L. Pronzato and A. Pázman, Design of Experiments in Nonlinear Models, 2013.
DOI : 10.1007/978-1-4614-6363-4

URL : https://hal.archives-ouvertes.fr/hal-00879984

B. Schölkopf, J. C. Platt, J. Shawe-Taylor, A. J. Smola, and R. C. Williamson, Estimating the Support of a High-Dimensional Distribution, Neural Computation, vol.13, issue.7, pp.1443-1471, 2001.

C. Schwab and R. Todor, Karhunen–Loève approximation of random fields by generalized fast multipole methods, Journal of Computational Physics, vol.217, issue.1, pp.100-122, 2006.
DOI : 10.1016/j.jcp.2006.01.048

L. Schwartz, Analyse Hilbertienne, Hermann, 1978.

S. Smale and D. Zhou, Geometry on probability spaces, Constructive Approximation, vol.30, pp.311-323, 2009.
DOI : 10.1007/s00365-009-9070-2

I. Steinwart and A. Christmann, Support Vector Machines, 2008.

S. Wang and Z. Zhang, Improving CUR matrix decomposition and the Nyström approximation via adaptive sampling, Journal of Machine Learning Research, vol.14, pp.2729-2769, 2013.