Improved Estimation of the Distance between Covariance Matrices

Abstract: A wide range of machine learning and signal processing applications involve data discrimination through covariance matrices. A broad family of metrics, among which the Frobenius, Fisher, and Bhattacharyya distances, as well as the Kullback-Leibler or Rényi divergences, are regularly exploited. Not being directly accessible, these metrics are usually assessed through empirical sample covariances. We show here that, for large dimensional data, these approximations lead to dramatically erroneous distance and divergence estimates. In this article, based on advanced random matrix considerations, we provide a novel and versatile consistent estimate for these covariance matrix distances and divergences. While theoretically developed for both large and numerous data, practical simulations demonstrate its large performance gains over the standard approach even for very small dimensions. A particular emphasis is made on the Fisher information metric and a concrete application to covariance-based spectral clustering is investigated.

Index Terms: Covariance distance, random matrix theory, Fisher information metric.
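To make the abstract's point concrete, the sketch below illustrates the standard "plug-in" approach it criticizes: the Fisher (affine-invariant) distance between two covariance matrices, computed from empirical sample covariances. This is not the paper's improved estimator; it is only a minimal illustration, with an arbitrary choice of dimension and sample size, of how the naive estimate deviates from the true distance when the dimension is not small relative to the sample count.

```python
import numpy as np
from scipy.linalg import eigvalsh

def fisher_distance(A, B):
    """Fisher information metric between SPD matrices A and B:
    sqrt(sum_i log^2(lambda_i)), lambda_i eigenvalues of A^{-1} B."""
    lam = eigvalsh(B, A)  # generalized eigenvalues of (B, A)
    return np.sqrt(np.sum(np.log(lam) ** 2))

rng = np.random.default_rng(0)
p, n = 50, 200  # dimension and per-class sample size (illustrative values)

# Two ground-truth covariances with a known Fisher distance: sqrt(p) * log(2)
C1 = np.eye(p)
C2 = 2.0 * np.eye(p)
true_dist = np.sqrt(p) * np.log(2.0)

# Empirical sample covariances from Gaussian samples
X1 = rng.multivariate_normal(np.zeros(p), C1, size=n)
X2 = rng.multivariate_normal(np.zeros(p), C2, size=n)
S1 = X1.T @ X1 / n
S2 = X2.T @ X2 / n

# Naive plug-in estimate: substantially overshoots true_dist when p/n is not small
plug_in = fisher_distance(S1, S2)
```

On the true matrices, `fisher_distance(C1, C2)` recovers `true_dist` exactly; on the sample covariances, the eigenvalue spreading induced by sampling noise inflates the estimate, which is the large-dimensional bias the paper's consistent estimator corrects.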
Document type: Conference papers
Cited literature: 15 references
Contributor: Malik Tiomoko
Submitted on : Tuesday, May 19, 2020 - 1:59:13 PM
Last modification on : Wednesday, November 3, 2021 - 5:12:46 AM

Malik Tiomoko, Romain Couillet, Eric Moisan, Steeve Zozor. Improved Estimation of the Distance between Covariance Matrices. ICASSP 2019 - IEEE International Conference on Acoustics, Speech and Signal Processing, May 2019, Brighton, United Kingdom. pp.7445-7449, ⟨10.1109/ICASSP.2019.8682621⟩. ⟨hal-02355321⟩