Improved Estimation of the Distance between Covariance Matrices

Malik Tiomoko 1 Romain Couillet 2 Eric Moisan 2 Steeve Zozor 2
GIPSA-DIS - Département Images et Signal
Abstract : A wide range of machine learning and signal processing applications involve data discrimination through covariance matrices. A broad family of metrics, among which the Frobenius, Fisher, and Bhattacharyya distances, as well as the Kullback-Leibler and Rényi divergences, are regularly exploited. Not being directly accessible, these metrics are usually assessed through empirical sample covariances. We show here that, for large dimensional data, these approximations lead to dramatically erroneous distance and divergence estimates. In this article, based on advanced random matrix considerations, we provide a novel and versatile consistent estimator for these covariance matrix distances and divergences. While theoretically developed for both large and numerous data, practical simulations demonstrate its large performance gains over the standard approach even for very small dimensions. A particular emphasis is placed on the Fisher information metric, and a concrete application to covariance-based spectral clustering is investigated.
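The bias the abstract describes is easy to reproduce numerically. The sketch below (not the authors' corrected estimator, just an illustration of the problem it addresses) computes the Fisher, i.e. affine-invariant, distance d(A, B) = (Σ log² λᵢ(A⁻¹B))^{1/2} between two known covariance matrices, then re-computes it by naively plugging in sample covariances built from n = 100 observations in dimension p = 50; the dimensions, seed, and choice of diagonal covariances are illustrative assumptions.

```python
import numpy as np

def fisher_distance(A, B):
    """Fisher (affine-invariant) distance between SPD matrices A and B:
    square root of the sum of squared log-eigenvalues of A^{-1} B."""
    lam = np.linalg.eigvals(np.linalg.solve(A, B)).real  # positive for SPD inputs
    return np.sqrt(np.sum(np.log(lam) ** 2))

rng = np.random.default_rng(0)
p, n = 50, 100                       # dimension comparable to sample size

# True covariances: C2 = 2 * C1, so every eigenvalue of C1^{-1} C2 is 2
C1 = np.eye(p)
C2 = 2.0 * np.eye(p)
true_d = fisher_distance(C1, C2)     # exactly sqrt(p) * log(2)

# Sample covariances from n Gaussian observations of each population
X1 = rng.standard_normal((n, p))                  # covariance C1
X2 = rng.standard_normal((n, p)) * np.sqrt(2.0)   # covariance C2
S1 = X1.T @ X1 / n
S2 = X2.T @ X2 / n

# Naive plug-in estimate: replace C1, C2 by S1, S2
plugin_d = fisher_distance(S1, S2)
print(f"true distance:    {true_d:.3f}")
print(f"plug-in estimate: {plugin_d:.3f}")
```

Because the eigenvalues of S1⁻¹S2 spread far around their population value when p/n is not small, the plug-in estimate systematically overshoots the true distance, which is the regime the paper's random-matrix-based estimator is designed to correct.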
Document type : Conference papers
Contributor : Malik Tiomoko
Submitted on : Friday, November 8, 2019 - 11:09:00 AM
Last modification on : Sunday, November 10, 2019 - 1:16:24 AM
Malik Tiomoko, Romain Couillet, Eric Moisan, Steeve Zozor. Improved Estimation of the Distance between Covariance Matrices. ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), May 2019, Brighton, United Kingdom. pp.7445-7449, ⟨10.1109/ICASSP.2019.8682621⟩. ⟨hal-02355321⟩