Uniform regret bounds over $R^d$ for the sequential linear regression problem with the square loss
Conference paper, 2019

Abstract

We consider the setting of online linear regression for arbitrary deterministic sequences, with the square loss. We pursue the aim set by Bartlett et al. (2015): obtaining regret bounds that hold uniformly over all competitor vectors. When the feature sequence is known at the beginning of the game, they provided closed-form regret bounds of $2 d B^2 \ln T + O(1)$, where $T$ is the number of rounds and $B$ is a bound on the observations. We instead derive bounds with an optimal constant of $1$ in front of the $d B^2 \ln T$ term. In the case of sequentially revealed features, we also derive an asymptotic regret bound of $d B^2 \ln T$ for any individual sequence of features and bounded observations. All our algorithms are variants of the online non-linear ridge regression forecaster, either with a data-dependent regularization or with almost no regularization.
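As a rough illustration of the baseline the paper builds on, the following is a minimal sketch of a sequential (online) ridge regression forecaster: at each round it predicts with the ridge estimator fitted on all past rounds, then observes the outcome and updates its statistics. This is a generic textbook sketch, not the paper's refined algorithms (which use a data-dependent or near-zero regularization); the function name, the synthetic data, and the choice of regularization `lam` are illustrative assumptions.

```python
import numpy as np

def online_ridge(features, observations, lam=1.0):
    """Sequential ridge forecaster (generic sketch, not the paper's variant).

    At round t, predict y_t with the ridge estimator computed from
    rounds 1..t-1, then observe y_t and update the sufficient statistics.
    """
    T, d = features.shape
    A = lam * np.eye(d)   # regularized Gram matrix  lam*I + sum_s x_s x_s^T
    b = np.zeros(d)       # sum_s y_s x_s
    preds = np.empty(T)
    for t, (x, y) in enumerate(zip(features, observations)):
        theta = np.linalg.solve(A, b)   # current ridge estimate
        preds[t] = theta @ x
        A += np.outer(x, x)
        b += y * x
    return preds

# Illustrative run on a synthetic noiseless sequence, so the best fixed
# comparator incurs zero loss and the regret equals the cumulative loss.
rng = np.random.default_rng(0)
T, d = 200, 2
X = rng.uniform(-1.0, 1.0, size=(T, d))
theta_star = np.array([1.0, -1.0])
y = X @ theta_star                      # bounded observations, |y_t| <= 2

preds = online_ridge(X, y)
online_loss = np.sum((preds - y) ** 2)
best_fixed_loss = 0.0                   # theta_star fits the data exactly
regret = online_loss - best_fixed_loss
```

On such a sequence the cumulative regret stays of order $\ln T$ rather than growing linearly, which is the qualitative behavior the $d B^2 \ln T$ bounds quantify.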

Dates and versions

hal-01802004 , version 1 (28-05-2018)
hal-01802004 , version 2 (19-02-2019)

Cite

Pierre Gaillard, Sébastien Gerchinovitz, Malo Huard, Gilles Stoltz. Uniform regret bounds over $R^d$ for the sequential linear regression problem with the square loss. The 30th International Conference on Algorithmic Learning Theory (ALT 2019), Mar 2019, Chicago, United States. pp.404-432. ⟨hal-01802004v2⟩