Shape and Time Distortion Loss for Training Deep Time Series Forecasting Models

Abstract: This paper addresses the problem of time series forecasting for non-stationary signals and multiple future steps prediction. To handle this challenging task, we introduce DILATE (DIstortion Loss including shApe and TimE), a new objective function for training deep neural networks. DILATE aims at accurately predicting sudden changes, and explicitly incorporates two terms supporting precise shape and temporal change detection. We introduce a differentiable loss function suitable for training deep neural nets, and provide a custom back-prop implementation for speeding up optimization. We also introduce a variant of DILATE, which provides a smooth generalization of temporally-constrained Dynamic Time Warping (DTW). Experiments carried out on various non-stationary datasets reveal the very good behaviour of DILATE compared to models trained with the standard Mean Squared Error (MSE) loss function, and also to DTW and variants. DILATE is also agnostic to the choice of the model, and we highlight its benefit for training fully connected networks as well as specialized recurrent architectures, showing its capacity to improve over state-of-the-art trajectory forecasting approaches.
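The abstract refers to a smooth (differentiable) generalization of DTW, which can be used as a shape term in a training loss. The sketch below is not the paper's DILATE implementation; it is a minimal NumPy illustration of the underlying idea: replacing the hard minimum in the classic DTW recursion with a soft-minimum (parameterized by a hypothetical smoothing constant `gamma`) so that the alignment cost becomes differentiable. Function names (`soft_min`, `soft_dtw`) and the squared-error cost are illustrative assumptions, not from the source.

```python
import numpy as np

def soft_min(values, gamma):
    # Differentiable soft-minimum: -gamma * log(sum_i exp(-v_i / gamma)).
    # Uses the log-sum-exp trick for numerical stability; handles inf entries.
    v = -np.asarray(values, dtype=float) / gamma
    m = v.max()
    return -gamma * (m + np.log(np.exp(v - m).sum()))

def soft_dtw(x, y, gamma=0.1):
    # Soft-DTW-style alignment cost between two 1-D series x and y,
    # with a pairwise squared-error ground cost. As gamma -> 0 this
    # recursion approaches the classic (hard-min) DTW cost.
    n, m = len(x), len(y)
    R = np.full((n + 1, m + 1), np.inf)
    R[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2
            # Soft relaxation of min(match, insertion, deletion)
            R[i, j] = cost + soft_min(
                [R[i - 1, j - 1], R[i - 1, j], R[i, j - 1]], gamma
            )
    return R[n, m]
```

Because the soft-minimum is smooth, gradients of this cost with respect to the predicted series exist everywhere, which is what makes such a term usable inside a neural-network training objective (the paper additionally adds a temporal term and a custom backward pass for speed).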

https://hal.archives-ouvertes.fr/hal-02291601
Contributor: Vincent Le Guen
Submitted on: Thursday, September 19, 2019 - 12:30:42 PM
Last modification on: Wednesday, February 5, 2020 - 5:38:13 PM
Long-term archiving on: Saturday, February 8, 2020 - 11:17:44 PM

Files

nips_2019.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-02291601, version 1
  • arXiv: 1909.09020

Citation

Vincent Le Guen, Nicolas Thome. Shape and Time Distortion Loss for Training Deep Time Series Forecasting Models. Advances in Neural Information Processing Systems 32 (NeurIPS 2019), Dec 2019, Vancouver, Canada. ⟨hal-02291601v1⟩
