# Fast rates in learning with dependent observations

Abstract: In this paper we tackle the problem of fast rates in time series forecasting from a statistical learning perspective. In a series of papers (e.g. Meir 2000, Modha and Masry 1998, Alquier and Wintenberger 2012) it is shown that the main tools used in learning theory with iid observations can be extended to the prediction of time series. The main message of these papers is that, given a family of predictors, we can build a new predictor that predicts the series as well as the best predictor in the family, up to a remainder of order $1/\sqrt{n}$. It is known that this rate cannot be improved in general. In this paper, we show that in the particular case of the least-squares loss, and under a strong assumption on the time series (phi-mixing), the remainder is actually of order $1/n$. Thus, the optimal rate for iid variables (see e.g. Tsybakov 2003) and for individual sequences (see \cite{lugosi}) is, for the first time, achieved for uniformly mixing processes. We also show that our method is optimal for aggregating sparse linear combinations of predictors.
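The slow- and fast-rate results described in the abstract are typically stated as oracle inequalities. The following is a hedged sketch of their generic form, with illustrative notation (the predictor family $\{f_\theta\}_{\theta\in\Theta}$, the aggregated predictor $\hat f$, the prediction risk $R$, and the constant $C$ are placeholders, not symbols taken from the paper):

```latex
% Slow rate: general losses, weakly dependent observations.
R(\hat f) \;\le\; \inf_{\theta \in \Theta} R(f_\theta) \;+\; \frac{C}{\sqrt{n}}

% Fast rate: least-squares loss, phi-mixing (uniformly mixing) series.
R(\hat f) \;\le\; \inf_{\theta \in \Theta} R(f_\theta) \;+\; \frac{C}{n}
```

In both displays the infimum term is the risk of the best predictor in the family; the paper's contribution is improving the remainder from $1/\sqrt{n}$ to $1/n$ under the stronger mixing assumption.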
Document type: Preprint, working paper
2012
Cited literature: [36 references]

https://hal.archives-ouvertes.fr/hal-00671979
Contributor: Pierre Alquier
Submitted on: Monday, February 20, 2012 - 11:20:01
Last modified on: Thursday, May 10, 2018 - 01:33:38
Document(s) archived on: Monday, May 21, 2012 - 02:21:29

### Files

predum6.pdf
Files produced by the author(s)

### Identifiers

• HAL Id : hal-00671979, version 1
• ARXIV : 1202.4283

### Citation

Pierre Alquier, Olivier Wintenberger. Fast rates in learning with dependent observations. 2012. 〈hal-00671979〉

### Metrics

Record views: 364

File downloads