Conference papers

Unsupervised Scalable Representation Learning for Multivariate Time Series

Abstract: Time series constitute a challenging data type for machine learning algorithms, due to their highly variable lengths and sparse labeling in practice. In this paper, we tackle this challenge by proposing an unsupervised method to learn universal embeddings of time series. Unlike previous works, it scales with respect to series length, and we demonstrate the quality, transferability and practicability of the learned representations with thorough experiments and comparisons. To this end, we combine an encoder based on causal dilated convolutions with a novel triplet loss employing time-based negative sampling, obtaining general-purpose representations for variable-length and multivariate time series.
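As a rough illustration of the two ingredients the abstract names, here is a minimal NumPy sketch of (a) a causal dilated 1-D convolution and (b) one plausible variant of time-based triplet sampling, in which the positive is a subseries of the reference and the negative is drawn from another series. This is an illustrative sketch only, not the authors' implementation; the function names and sampling details are assumptions for exposition.

```python
import numpy as np

def causal_dilated_conv1d(x, w, dilation):
    """Causal dilated 1-D convolution: the output at time t depends only on
    x[t], x[t-d], x[t-2d], ... (left zero-padding enforces causality)."""
    k = len(w)
    pad = dilation * (k - 1)
    xp = np.concatenate([np.zeros(pad), np.asarray(x, dtype=float)])
    return np.array([sum(w[j] * xp[t + pad - j * dilation] for j in range(k))
                     for t in range(len(x))])

def sample_triplet(batch, rng):
    """Time-based triplet sampling (hypothetical variant): the positive is a
    random subseries of the reference subseries; the negative is a random
    subseries of a different series in the batch (requires >= 2 series)."""
    n, T = batch.shape
    i = rng.integers(n)                                  # reference series
    ref_len = rng.integers(2, T + 1)
    ref_start = rng.integers(T - ref_len + 1)
    ref = batch[i, ref_start:ref_start + ref_len]
    pos_len = rng.integers(1, ref_len + 1)               # positive inside ref
    pos_start = ref_start + rng.integers(ref_len - pos_len + 1)
    pos = batch[i, pos_start:pos_start + pos_len]
    j = rng.choice([k for k in range(n) if k != i])      # negative: other series
    neg_len = rng.integers(1, T + 1)
    neg_start = rng.integers(T - neg_len + 1)
    neg = batch[j, neg_start:neg_start + neg_len]
    return ref, pos, neg
```

Stacking such causal dilated convolutions with geometrically increasing dilation gives a receptive field that grows exponentially with depth, which is what makes the encoder scalable in series length.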
Complete list of metadata

https://hal.archives-ouvertes.fr/hal-01998101
Contributor: Jean-Yves Franceschi
Submitted on: Wednesday, December 18, 2019 - 3:53:55 PM
Last modification on: Monday, February 24, 2020 - 1:27:51 PM
Document(s) archived on: Thursday, March 19, 2020 - 10:15:28 PM

Files

article-supplementary-NeurIPS1... (files produced by the author(s))

Licence

Distributed under a Creative Commons Attribution 4.0 International License

Identifiers

  • HAL Id: hal-01998101, version 4
  • arXiv: 1901.10738

Citation

Jean-Yves Franceschi, Aymeric Dieuleveut, Martin Jaggi. Unsupervised Scalable Representation Learning for Multivariate Time Series. Thirty-third Conference on Neural Information Processing Systems, Neural Information Processing Systems Foundation, Dec 2019, Vancouver, Canada. pp.4650-4661. ⟨hal-01998101v4⟩

Metrics

  • Record views: 119
  • Files downloads: 70