
Leveraging Joint-Diagonalization in Transform-Learning NMF

Sixin Zhang¹, Emmanuel Soubies¹, Cédric Févotte¹
¹ IRIT-SC - Signal et Communications, IRIT - Institut de recherche en informatique de Toulouse
Abstract : Non-negative matrix factorization with transform learning (TL-NMF) is a recent idea that aims at learning data representations suited to NMF. In this work, we relate TL-NMF to the classical matrix joint-diagonalization (JD) problem. We show that, when the number of data realizations is sufficiently large, TL-NMF can be replaced by a two-step approach, termed JD+NMF, that estimates the transform through JD prior to the NMF computation. In contrast, when the number of data realizations is limited, not only is JD+NMF no longer equivalent to TL-NMF, but the inherent low-rank constraint of TL-NMF turns out to be an essential ingredient for learning meaningful transforms for NMF.
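The two-step JD+NMF pipeline described in the abstract can be sketched as follows. This is a hedged toy illustration, not the authors' implementation: the joint diagonalization of the per-realization covariances is replaced here by a crude stand-in (an eigendecomposition of their average, which coincides with exact JD only when all covariances share the same eigenvectors), and the NMF step uses standard Frobenius multiplicative updates. All names and dimensions (`Phi`, `W`, `H`, `N`, `K`) are illustrative assumptions, not notation from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: M realizations of N-dimensional signals, T frames each.
N, T, M, K = 8, 50, 20, 3
X = [rng.standard_normal((N, T)) for _ in range(M)]

# --- Step 1: estimate a common transform Phi.
# Proper JD would jointly diagonalize all per-realization covariances;
# as a stand-in we diagonalize their average (exact only when the
# covariances share a common eigenbasis).
C_mean = sum(x @ x.T / T for x in X) / M
_, eigvecs = np.linalg.eigh(C_mean)
Phi = eigvecs.T  # rows of Phi act as the analysis transform

# --- Step 2: NMF on the non-negative transformed data V = |Phi X|^2,
# using Frobenius-norm multiplicative updates (Lee-Seung style).
V = np.abs(Phi @ np.hstack(X)) ** 2
W = rng.random((N, K)) + 1e-3
H = rng.random((K, V.shape[1])) + 1e-3
for _ in range(200):
    H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
    W *= (V @ H.T) / (W @ H @ H.T + 1e-12)

rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(f"relative reconstruction error: {rel_err:.3f}")
```

The point of the two-step structure is that the transform is fixed before the factorization starts; TL-NMF, by contrast, optimizes the transform and the factors jointly, which (per the abstract) matters when few data realizations are available.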
Document type :
Journal articles

https://hal.archives-ouvertes.fr/hal-03481041
Contributor : Sixin Zhang
Submitted on : Wednesday, December 15, 2021 - 9:57:10 AM
Last modification on : Tuesday, August 23, 2022 - 9:11:28 AM

Licence

Distributed under a Creative Commons Attribution 4.0 International License

Citation

Sixin Zhang, Emmanuel Soubies, Cédric Févotte. Leveraging Joint-Diagonalization in Transform-Learning NMF. IEEE Transactions on Signal Processing, Institute of Electrical and Electronics Engineers, 2022, ⟨10.1109/TSP.2022.3188177⟩. ⟨hal-03481041⟩
