Prediction of time series by statistical learning: General losses and fast rates

Pierre Alquier, Xiaoyin Li, Olivier Wintenberger

Research output: Contribution to journal › Article › peer-review


Abstract

We establish rates of convergence in statistical learning for time series forecasting. Using the PAC-Bayesian approach, slow rates of convergence of order √(d/n) for the Gibbs estimator under the absolute loss were given in a previous work [7], where n is the sample size and d the dimension of the set of predictors. Under the same weak dependence conditions, we extend this result to any convex Lipschitz loss function. We also identify a condition on the parameter space that ensures similar rates for the classical penalized ERM procedure. We apply this method to quantile forecasting of the French GDP. Under additional conditions on the loss functions (satisfied by the quadratic loss function) and for uniformly mixing processes, we prove that the Gibbs estimator actually achieves fast rates of convergence of order d/n. We discuss the optimality of these different rates, pointing out references to lower bounds when they are available. In particular, these results generalize the results of [29] on sparse regression estimation to the autoregressive setting.
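As a point of reference for readers, the Gibbs estimator mentioned in the abstract is, in the PAC-Bayesian literature, typically a randomized predictor drawn from (or averaged under) an exponentially weighted posterior. A schematic definition, with notation chosen here for illustration rather than taken from the paper itself, is

\[
\hat{\rho}_{\lambda}(d\theta) \;\propto\; \exp\bigl(-\lambda\, r_n(\theta)\bigr)\, \pi(d\theta),
\]

where \(\pi\) is a prior distribution on the parameter set \(\Theta \subset \mathbb{R}^d\), \(r_n(\theta)\) is the empirical prediction risk of the predictor indexed by \(\theta\) on the observed series, and \(\lambda > 0\) is an inverse-temperature parameter. Oracle inequalities of the kind discussed above then bound the excess risk of such an estimator by a term of order √(d/n) for convex Lipschitz losses, and of order d/n under the stronger conditions on the loss and on the mixing of the process.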

Original language: English (US)
Pages (from-to): 65-93
Number of pages: 29
Journal: Dependence Modeling
Volume: 1
Issue number: 1
DOIs
State: Published - 2013
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2013 Olivier Wintenberger et al., licensee Versita Sp. z o. o.

Keywords

  • Fast rates
  • GDP forecasting
  • Mixing
  • Oracle inequalities
  • PAC-Bayesian bounds
  • Statistical learning theory
  • Time series forecasting
  • Weak dependence
