Journal articles

A Random Matrix Perspective on Random Tensors

Abstract: Several machine learning problems, such as latent variable model learning and community detection, can be addressed by estimating a low-rank signal from a noisy tensor. Despite recent substantial progress on the fundamental limits of the corresponding estimators in the large-dimensional setting, some of the most significant results are based on spin glass theory, which is not easily accessible to non-experts. We propose a sharply distinct and more elementary approach, relying on tools from random matrix theory. The key idea is to study random matrices arising from contractions of a random tensor, which give access to its spectral properties. In particular, for a symmetric dth-order rank-one model with Gaussian noise, our approach yields a novel characterization of maximum likelihood (ML) estimation performance in terms of a fixed-point equation, valid in the regime where weak recovery is possible. For d = 3, the solution to this equation matches the existing results. We conjecture that the same holds for any order d, based on numerical evidence for d ∈ {4, 5}. Moreover, our analysis illuminates certain properties of the large-dimensional ML landscape. Our approach can be extended to other models, including asymmetric and non-Gaussian ones.
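The key idea stated in the abstract, contracting a noisy rank-one tensor along one mode to obtain a random matrix whose spectrum reveals the signal, can be illustrated with a small NumPy sketch. The dimension, signal-to-noise value, and noise symmetrization below are illustrative choices for a symmetric third-order (d = 3) spiked model, not the paper's exact normalization:

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta = 200, 3.0  # dimension and signal-to-noise ratio (illustrative values)

# Planted rank-one signal direction: a unit vector x
x = rng.standard_normal(n)
x /= np.linalg.norm(x)

# Symmetric Gaussian noise tensor W (symmetrized over all index permutations)
W = rng.standard_normal((n, n, n))
W = (W + W.transpose(1, 0, 2) + W.transpose(0, 2, 1)
       + W.transpose(1, 2, 0) + W.transpose(2, 0, 1) + W.transpose(2, 1, 0)) / 6

# Spiked tensor model: T = beta * (x ⊗ x ⊗ x) + noise scaled by 1/sqrt(n)
T = beta * np.einsum('i,j,k->ijk', x, x, x) + W / np.sqrt(n)

# Contracting T along its third mode with x yields a symmetric n×n random matrix:
# a rank-one perturbation (beta * x x^T) of a Wigner-like noise matrix
M = np.einsum('ijk,k->ij', T, x)

# Its spectrum: a bulk of noise eigenvalues plus one outlier near beta,
# which is what makes the signal detectable from the contraction
eigvals = np.linalg.eigvalsh(M)
print(eigvals[-1])  # largest eigenvalue, separated from the bulk for large beta
```

In practice the planted direction x is unknown; here the contraction is taken with x itself only to make the spike visible in a few lines. The paper's analysis concerns contractions and the resulting spectral properties in the large-n regime.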
Contributor: José Henrique De Morais Goulart
Submitted on: Sunday, October 2, 2022 - 5:39:08 PM
Last modification on: Tuesday, October 25, 2022 - 4:23:59 PM


Distributed under a Creative Commons Attribution 4.0 International License



  • HAL Id: hal-03793940, version 1
  • arXiv: 2108.00774


José Henrique de M Goulart, Romain Couillet, Pierre Comon. A Random Matrix Perspective on Random Tensors. Journal of Machine Learning Research, 2022, 23 (264), pp.1-36. ⟨hal-03793940⟩


