Conference papers

Comparing the modeling powers of RNN and HMM

Abstract: Recurrent Neural Networks (RNN) and Hidden Markov Models (HMM) are popular models for processing sequential data and have found many applications such as speech recognition, time series prediction or machine translation. Although both models have been extended in several ways (e.g. Long Short Term Memory and Gated Recurrent Unit architectures, Variational RNN, partially observed Markov models, etc.), their theoretical understanding remains partially open. In this context, our approach consists in classifying both models from an information geometry point of view. More precisely, both models can be used for modeling the distribution of a sequence of random observations from a set of latent variables; however, in an RNN, the latent variable is deterministically deduced from the current observation and the previous latent variable, while, in an HMM, the set of (random) latent variables is a Markov chain. In this paper, we first embed these two generative models into a unified generative model (GUM). We next consider the subclass of GUM models which yield a stationary Gaussian observations probability distribution function (pdf). Such pdfs are characterized by their covariance sequence; we show that the GUM model can produce any stationary Gaussian distribution with geometrical covariance structure. We finally discuss the modeling power of the HMM and RNN submodels, via their associated observations pdfs: some observations pdfs can be modeled by an RNN, but not by an HMM, and vice versa; some can be produced by both structures, up to a re-parameterization.
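The structural difference between the two generative models described in the abstract can be illustrated with a minimal sketch (not taken from the paper; all parameter values and function names below are hypothetical, with scalar linear-Gaussian dynamics chosen for simplicity):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_hmm(a=0.9, q=0.1, r=0.2, T=5):
    """HMM: the latent variables form a (random) Markov chain;
    each observation is a noisy emission of the current latent state."""
    x = np.zeros(T)
    h = 0.0
    for t in range(T):
        h = a * h + np.sqrt(q) * rng.standard_normal()  # random state transition
        x[t] = h + np.sqrt(r) * rng.standard_normal()   # noisy emission
    return x

def sample_rnn(a=0.5, b=0.3, r=0.2, T=5):
    """RNN-style generative model: the latent variable is *deterministically*
    deduced from the current observation and the previous latent variable;
    randomness enters only through the observation."""
    x = np.zeros(T)
    h = 0.0
    for t in range(T):
        x[t] = h + np.sqrt(r) * rng.standard_normal()   # random observation
        h = a * h + b * x[t]                            # deterministic latent update
    return x

print(sample_hmm())
print(sample_rnn())
```

In both cases the observation sequence is Gaussian, so each model induces a stationary Gaussian pdf characterized by its covariance sequence; the paper's comparison is carried out at the level of these induced pdfs.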

Contributor: François Desbouvries
Submitted on: Friday, November 29, 2019 - 3:23:44 PM
Last modification on: Wednesday, October 14, 2020 - 4:21:44 AM



HAL Id: hal-02387002, version 1


Achille Salaün, Yohan Petetin, François Desbouvries. Comparing the modeling powers of RNN and HMM. ICMLA 2019: 18th International Conference on Machine Learning and Applications, Dec 2019, Boca Raton, FL, United States. pp.1496-1499. ⟨hal-02387002⟩


