Geometry-aware stationary subspace analysis - Archive ouverte HAL
Conference paper, Year: 2016

Geometry-aware stationary subspace analysis

Abstract

In many real-world applications, data exhibits non-stationarity, i.e., its distribution changes over time. One approach to handling non-stationarity is to remove or minimize it before attempting to analyze the data. In the context of brain-computer interface (BCI) data analysis, this is sometimes achieved using stationary subspace analysis (SSA). The classic SSA method finds a matrix that projects the data onto a stationary subspace by optimizing a cost function based on a matrix divergence. In this work we present an alternative method for SSA based on a symmetrized version of this matrix divergence. We show that this frames the problem in terms of distances between symmetric positive definite (SPD) matrices, suggesting a geometric interpretation of the problem. Stemming from this geometric viewpoint, we introduce and analyze a method that utilizes the geometry of the SPD matrix manifold and the invariance properties of its metrics. Most notably, we show that these invariances obviate the need to whiten the input matrices, a common step in many SSA methods that often introduces error. We demonstrate the usefulness of our technique in experiments on both synthetic and real-world data.
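To make the geometric viewpoint concrete, the sketch below (Python/NumPy, not the authors' implementation) computes the affine-invariant Riemannian distance between SPD covariance matrices and checks that it is unchanged under any invertible congruence transform, which is the kind of invariance that lets a geometry-aware cost dispense with an explicit whitening step. The function name airm_distance and the random test matrices are illustrative assumptions only.

import numpy as np
from scipy.linalg import eigvalsh

def airm_distance(A, B):
    """Affine-invariant Riemannian distance between SPD matrices A and B:
    d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F,
    computed from the generalized eigenvalues of the pencil (B, A)."""
    eigs = eigvalsh(B, A)  # solves B v = lambda A v
    return np.sqrt(np.sum(np.log(eigs) ** 2))

def random_spd(d, rng):
    """Random symmetric positive definite matrix, for illustration only."""
    M = rng.standard_normal((d, d))
    return M @ M.T + d * np.eye(d)

rng = np.random.default_rng(0)
d = 4
A, B = random_spd(d, rng), random_spd(d, rng)
W = rng.standard_normal((d, d))  # any invertible map, e.g. a whitening transform

# Congruence invariance: d(W A W^T, W B W^T) == d(A, B), so whitening the
# covariance matrices beforehand does not change this distance.
print(airm_distance(A, B))
print(airm_distance(W @ A @ W.T, W @ B @ W.T))

The two printed values agree (up to numerical precision), illustrating why a cost built from such invariant distances need not rely on a separate, error-prone whitening stage.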
No file deposited

Dates and versions

hal-01447959, version 1 (27-01-2017)

Identifiers

  • HAL Id: hal-01447959, version 1

Cite

Inbal Horev, Florian Yger, Masashi Sugiyama. Geometry-aware stationary subspace analysis. 8th Asian Conference on Machine Learning (ACML 2016), Nov 2016, Hamilton, New Zealand. pp.430-444. ⟨hal-01447959⟩