Sensitivity analysis in HMMs with application to likelihood maximization

Pierre-Arnaud Coquelin ¹, Romain Deguest ², Rémi Munos ¹
¹ SEQUEL (Sequential Learning): LIFL - Laboratoire d'Informatique Fondamentale de Lille, Inria Lille - Nord Europe, LAGIS - Laboratoire d'Automatique, Génie Informatique et Signal
Abstract: This paper considers sensitivity analysis in Hidden Markov Models (HMMs) with continuous state and observation spaces. We propose an Infinitesimal Perturbation Analysis (IPA) of the filtering distribution with respect to some parameters of the model. We describe a methodology for turning any algorithm that estimates the filtering density, such as Sequential Monte Carlo methods, into an algorithm that estimates its gradient. The resulting IPA estimator is proven to be asymptotically unbiased and consistent, and its computational complexity is linear in the number of particles. We apply this analysis to the problem of identifying unknown parameters of the model given a sequence of observations: we derive an IPA estimator for the gradient of the log-likelihood, which may be used in a gradient method for likelihood maximization. We illustrate the method with several numerical experiments.
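The IPA idea in the abstract can be sketched on a toy model. The sketch below is an illustrative assumption, not the paper's actual algorithm: it uses a scalar linear-Gaussian HMM (X_t = θX_{t-1} + V_t, Y_t = X_t + W_t), plain sequential importance sampling with no resampling (differentiating through resampling is precisely the difficulty the paper's estimator addresses), and propagates the pathwise derivative dX_t/dθ alongside each particle to obtain the gradient of the log-likelihood estimate. All function and variable names are hypothetical.

```python
import numpy as np

def ipa_loglik_and_grad(theta, ys, n_particles=200, sigma_v=1.0, sigma_w=1.0, seed=0):
    """Sequential importance sampling (no resampling, for clarity) for the
    toy model  X_t = theta*X_{t-1} + sigma_v*V_t,  Y_t = X_t + sigma_w*W_t.
    Returns the log-likelihood estimate and its IPA (pathwise) gradient in
    theta, using fixed random numbers so the estimate is smooth in theta."""
    rng = np.random.default_rng(seed)
    N = n_particles
    x = rng.standard_normal(N)   # X_0 ~ N(0, 1), independent of theta
    dx = np.zeros(N)             # pathwise derivative dX_t/dtheta per particle
    logw = np.zeros(N)           # accumulated log importance weight per particle
    dlogw = np.zeros(N)          # d(log weight)/dtheta, accumulated by IPA
    for y in ys:
        v = rng.standard_normal(N)
        dx = x + theta * dx              # chain rule on theta*x + sigma_v*v
        x = theta * x + sigma_v * v      # bootstrap proposal: sample the transition
        # Gaussian observation density g(y | x) and its log-derivative in x
        logw += -0.5 * np.log(2 * np.pi * sigma_w**2) - 0.5 * ((y - x) / sigma_w) ** 2
        dlogw += (y - x) / sigma_w**2 * dx
    # log (1/N) sum_i prod_t g(y_t | x_t^i), computed stably
    m = logw.max()
    w = np.exp(logw - m)
    loglik = m + np.log(w.mean())
    # d/dtheta log sum_i W_i = sum_i W_i * dlogW_i / sum_i W_i
    grad = (w * dlogw).sum() / w.sum()
    return loglik, grad
```

Because the random numbers are held fixed, the estimate is a smooth function of θ and the IPA gradient matches a finite-difference check; in a gradient method one would then iterate θ ← θ + η·grad to climb the likelihood. Complexity is linear in the number of particles, as the abstract states for the paper's estimator.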
Document type :
Conference papers
Contributor: Rémi Munos
Submitted on: Tuesday, June 4, 2013

HAL Id: hal-00830166, version 1



Pierre-Arnaud Coquelin, Romain Deguest, Rémi Munos. Sensitivity analysis in HMMs with application to likelihood maximization. Advances in Neural Information Processing Systems, 2009, Canada. ⟨hal-00830166⟩
