# Linear Regression with Random Projections

SEQUEL (Sequential Learning) team, Inria Lille - Nord Europe; LIFL - Laboratoire d'Informatique Fondamentale de Lille; LAGIS - Laboratoire d'Automatique, Génie Informatique et Signal
Abstract: We investigate a method for regression that makes use of a randomly generated subspace $G_P$ (of finite dimension $P$) of a given large (possibly infinite) dimensional function space $F$, for example, $L_{2}([0,1]^d)$. $G_P$ is defined as the span of $P$ random features that are linear combinations of the basis functions of $F$ weighted by random Gaussian i.i.d.~coefficients. We show practical motivation for the use of this approach, detail the link that this random projection method shares with the theory of RKHS and Gaussian objects, prove approximation error bounds, in both deterministic and random design, when searching for the best regression function in $G_P$ rather than in $F$, and derive excess risk bounds for a specific regression algorithm (least squares regression in $G_P$). This paper stresses the motivation for studying such methods; the analysis is therefore kept simple for expository purposes and leaves room for future developments.
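As a concrete illustration of the method the abstract describes (not the paper's experimental setup), the following sketch performs least-squares regression in a random subspace $G_P$. It assumes a cosine basis of $F$ on $[0,1]$, a truncation level $K$, and a $1/\sqrt{K}$ scaling of the Gaussian weights; all of these choices are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

K = 50  # number of basis functions of F used in this sketch (an assumption)
P = 10  # dimension of the random subspace G_P

def basis(x):
    """Evaluate the first K cosine basis functions at points x: (n,) -> (n, K)."""
    k = np.arange(K)
    return np.cos(np.pi * np.outer(x, k))

# Random features: P linear combinations of the basis functions with
# i.i.d. Gaussian weights; the 1/sqrt(K) scaling is an illustrative choice.
A = rng.normal(size=(P, K)) / np.sqrt(K)

def random_features(x):
    """Evaluate the P random features spanning G_P at points x: (n,) -> (n, P)."""
    return basis(x) @ A.T

# Toy regression data: noisy observations of a smooth target function.
n = 200
x = rng.uniform(0.0, 1.0, size=n)
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=n)

# Least-squares regression restricted to G_P.
Phi = random_features(x)                          # (n, P) design matrix
theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # coefficients in G_P

# Predictor: f_hat(x) = sum_p theta_p * phi_p(x), evaluated on a test grid.
x_test = np.linspace(0.0, 1.0, 100)
y_hat = random_features(x_test) @ theta
print(y_hat.shape)  # (100,)
```

Only the $P \times K$ Gaussian matrix and the solved coefficients need to be stored, which is the practical appeal of working in $G_P$ rather than in the full $K$-dimensional (or infinite-dimensional) space $F$.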
Document type: Journal article

Cited literature [37 references]

https://hal.archives-ouvertes.fr/hal-00771487
Contributor: Rémi Munos
Submitted on: Tuesday, January 8, 2013 - 5:21:07 PM


### Identifiers

• HAL Id: hal-00771487, version 1

### Citation

Odalric Maillard, Rémi Munos. Linear Regression with Random Projections. Journal of Machine Learning Research, Microtome Publishing, 2012, 13, pp.2735-2772. ⟨hal-00771487⟩
