Online Expectation Maximization based algorithms for inference in hidden Markov models
Abstract
The Expectation Maximization (EM) algorithm is a versatile tool for model parameter estimation in latent data models. When processing large data sets or data streams, however, EM becomes intractable, since it requires the whole data set to be available at each iteration of the algorithm. In this contribution, a new generic online EM algorithm for model parameter inference in general hidden Markov models is proposed. This new algorithm updates the parameter estimate after each block of observations is processed (online). The convergence of this new algorithm is established, and the rate of convergence is studied, showing the impact of the block size. An averaging procedure is also proposed to improve the rate of convergence. Finally, practical illustrations are presented, as well as extensions to online stochastic versions of EM in which Sequential Monte Carlo methods are used to make the E-step tractable.
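The block-wise update and averaging procedure described above can be illustrated on a toy model. The following is a minimal sketch, not the thesis's algorithm for general hidden Markov models: it applies an online EM recursion of the stochastic-approximation-on-sufficient-statistics type to a stream of observations from a two-component Gaussian mixture (a model with independent latent variables, so the E-step is trivially tractable). The mixture weights and variances are held fixed for brevity; only the component means are estimated. All names and constants here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data stream: 1-D two-component Gaussian mixture, unit variances.
true_means = np.array([-2.0, 3.0])

def stream_block(size):
    z = rng.integers(0, 2, size)          # latent component labels
    return rng.normal(true_means[z], 1.0)  # observations

# Initial parameter estimate and running sufficient statistics.
means = np.array([-1.0, 1.0])
S0 = np.full(2, 0.5)       # running average of responsibilities E[r_k]
S1 = means * S0            # running average of E[r_k * y]
avg_means = means.copy()   # Polyak-Ruppert average of the iterates

block_size = 50
for n in range(1, 501):
    y = stream_block(block_size)

    # E-step on the current block: responsibilities under the current means
    # (equal weights and unit variances, so only the squared distance matters).
    logp = -0.5 * (y[:, None] - means[None, :]) ** 2
    r = np.exp(logp - logp.max(axis=1, keepdims=True))
    r /= r.sum(axis=1, keepdims=True)

    # Stochastic-approximation update of the sufficient statistics,
    # with step size gamma_n = n^{-a}, a in (1/2, 1].
    gamma = n ** -0.6
    S0 = (1 - gamma) * S0 + gamma * r.mean(axis=0)
    S1 = (1 - gamma) * S1 + gamma * (r * y[:, None]).mean(axis=0)

    # M-step: parameter estimate as a function of the running statistics.
    means = S1 / S0

    # Averaging of the iterates to improve the rate of convergence.
    avg_means = avg_means + (means - avg_means) / n

print(np.sort(avg_means))  # close to the true means [-2, 3]
```

The key design point mirrored from the abstract is that the parameter estimate is refreshed once per block rather than once per full pass over the data, so memory usage is constant in the stream length; the final averaging step is the proposed remedy for the slower rate induced by the decreasing step sizes.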