Bounds on Lyapunov Exponents via Entropy Accumulation - Archive ouverte HAL
Journal article, IEEE Transactions on Information Theory, 2021

Bounds on Lyapunov Exponents via Entropy Accumulation

Abstract

Lyapunov exponents describe the asymptotic behavior of the singular values of large products of random matrices. A direct computation of these exponents is, however, often infeasible. By establishing a link between Lyapunov exponents and an information-theoretic tool called the entropy accumulation theorem, we derive an upper bound for the maximal Lyapunov exponent and a lower bound for the minimal one. The bounds assume independence of the random matrices, are analytical, and are tight in the commutative case as well as in other scenarios. They can be expressed in terms of an optimization problem that involves only single matrices rather than large products. The upper bound for the maximal Lyapunov exponent can be evaluated efficiently via the theory of convex optimization.
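The abstract itself contains no code; as an illustration of the quantity being bounded, the following NumPy sketch estimates the maximal Lyapunov exponent by tracking the growth of a generic vector under an i.i.d. random matrix product. The sampler and the commutative test case below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def max_lyapunov_estimate(sample_matrix, dim=2, n_steps=5000, seed=0):
    """Monte Carlo estimate of the maximal Lyapunov exponent
    lim_n (1/n) log ||M_n ... M_1|| for i.i.d. matrices M_i,
    obtained by following a generic unit vector."""
    rng = np.random.default_rng(seed)
    v = np.ones(dim) / np.sqrt(dim)      # generic starting unit vector
    log_growth = 0.0
    for _ in range(n_steps):
        v = sample_matrix(rng) @ v
        norm = np.linalg.norm(v)
        log_growth += np.log(norm)       # telescopes to log of the product's action on v
        v /= norm                        # renormalize to avoid overflow
    return log_growth / n_steps

# Commutative (diagonal) example: diag(2, 1) or diag(1, 2), each with probability 1/2.
# Here the maximal Lyapunov exponent is known exactly: (log 2) / 2.
def diagonal_sample(rng):
    return np.diag([2.0, 1.0]) if rng.random() < 0.5 else np.diag([1.0, 2.0])

estimate = max_lyapunov_estimate(diagonal_sample)
```

The commutative case is chosen because, per the abstract, the paper's bounds are tight there, so the Monte Carlo estimate can be checked against the closed-form value.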

Dates and versions

hal-03130062, version 1 (03-02-2021)

Identifiers

Cite

David Sutter, Omar Fawzi, Renato Renner. Bounds on Lyapunov Exponents via Entropy Accumulation. IEEE Transactions on Information Theory, 2021, 67 (1), pp.10-24. ⟨10.1109/TIT.2020.3026959⟩. ⟨hal-03130062⟩