# Uniform regret bounds over $R^d$ for the sequential linear regression problem with the square loss

SIERRA - Statistical Machine Learning and Parsimony
DI-ENS - Département d'informatique de l'École normale supérieure, CNRS - Centre National de la Recherche Scientifique, Inria de Paris
Abstract: We consider the setting of online linear regression for arbitrary deterministic sequences, with the square loss. We are interested in the aim set by Bartlett et al. (2015): to obtain regret bounds that hold uniformly over all competitor vectors. When the feature sequence is known at the beginning of the game, they provided closed-form regret bounds of $2 d B^2 \ln T + O(1)$, where $T$ is the number of rounds and $B$ is a bound on the observations. We instead derive bounds with an optimal constant of $1$ in front of the $d B^2 \ln T$ term. In the case of sequentially revealed features, we also derive an asymptotic regret bound of $d B^2 \ln T$ for any individual sequence of features and bounded observations. All our algorithms are variants of the online non-linear ridge regression forecaster, either with a data-dependent regularization or with almost no regularization.
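The "online non-linear ridge regression forecaster" mentioned in the abstract is also known as the Vovk-Azoury-Warmuth forecaster: at each round it forms a ridge estimate that already incorporates the current feature vector before the observation is revealed. The following is a minimal sketch with a fixed regularization parameter; the paper's variants use data-dependent or nearly vanishing regularization instead, and the function name and parameters (`lam`, `B`) are illustrative, not taken from the paper.

```python
import numpy as np

def vaw_forecaster(features, observations, lam=1.0, B=1.0):
    """Sketch of the Vovk-Azoury-Warmuth ("non-linear ridge regression")
    forecaster run on a sequence of (x_t, y_t) pairs.

    lam is a fixed ridge regularization and B the assumed bound on the
    observations, used to clip the predictions.
    """
    T, d = features.shape
    A = lam * np.eye(d)      # regularized Gram matrix: lam*I + sum_s x_s x_s^T
    b = np.zeros(d)          # running sum of y_s x_s over past rounds
    preds = np.empty(T)
    for t, (x, y) in enumerate(zip(features, observations)):
        A += np.outer(x, x)              # include the CURRENT feature x_t
        theta = np.linalg.solve(A, b)    # ridge estimate, computed before y_t
        preds[t] = np.clip(x @ theta, -B, B)
        b += y * x                       # y_t is revealed; update statistics
    return preds
```

Including the current feature in the Gram matrix before predicting is what distinguishes this forecaster from plain online ridge regression, and it is the ingredient behind the logarithmic regret bounds discussed above.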
Document type: Journal article
Proceedings of Machine Learning Research, PMLR, In press, 98, pp.1-29

https://hal.archives-ouvertes.fr/hal-01802004
Contributor: Gilles Stoltz
Submitted on: Tuesday, February 19, 2019 - 19:12:41
Last modified on: Tuesday, February 26, 2019 - 17:08:39

### Files

Gaillard-Gerchinovitz-Huard-St...
Files produced by the author(s)

### Identifiers

• HAL Id : hal-01802004, version 2
• ARXIV : 1805.11386

### Citation

Pierre Gaillard, Sébastien Gerchinovitz, Malo Huard, Gilles Stoltz. Uniform regret bounds over $R^d$ for the sequential linear regression problem with the square loss. Proceedings of Machine Learning Research, PMLR, In press, 98, pp.1-29. 〈hal-01802004v2〉
