Sensitivity analysis in HMMs with application to likelihood maximization

Pierre-Arnaud Coquelin¹, Romain Deguest², Rémi Munos¹
¹ SEQUEL - Sequential Learning, LIFL - Laboratoire d'Informatique Fondamentale de Lille, LAGIS - Laboratoire d'Automatique, Génie Informatique et Signal, Inria Lille - Nord Europe
Abstract: This paper considers sensitivity analysis in Hidden Markov Models (HMMs) with continuous state and observation spaces. We propose an Infinitesimal Perturbation Analysis (IPA) of the filtering distribution with respect to some parameters of the model. We describe a methodology for turning any algorithm that estimates the filtering density, such as Sequential Monte Carlo methods, into an algorithm that estimates its gradient. The resulting IPA estimator is proven to be asymptotically unbiased and consistent, and its computational complexity is linear in the number of particles. We apply this analysis to the problem of identifying unknown parameters of the model given a sequence of observations: we derive an IPA estimator for the gradient of the log-likelihood, which may be used in a gradient method for the purpose of likelihood maximization. We illustrate the method with several numerical experiments.
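The likelihood-maximization application mentioned in the abstract rests on the standard decomposition of the HMM log-likelihood into one-step predictive terms, each an integral against the predictive distribution that a Sequential Monte Carlo filter approximates. The sketch below states this identity in LaTeX under assumed notation not taken from the paper itself: x_t is the hidden state, y_t the observation, g_theta(y_t | x_t) the observation density, and p_theta(x_t | y_{1:t-1}) the one-step predictive distribution.

% Standard HMM identity: the log-likelihood of y_{1:T} factors into
% one-step predictive terms, each an integral against the predictive
% distribution p_\theta(x_t | y_{1:t-1}).
\[
  \ell_T(\theta) \;=\; \log p_\theta(y_{1:T})
  \;=\; \sum_{t=1}^{T} \log p_\theta\!\left(y_t \mid y_{1:t-1}\right)
  \;=\; \sum_{t=1}^{T} \log \int g_\theta(y_t \mid x_t)\,
        p_\theta(x_t \mid y_{1:t-1})\, \mathrm{d}x_t .
\]
% An SMC filter replaces each integral by a particle average; differentiating
% that particle approximation with respect to \theta (the IPA idea described
% in the abstract) gives an estimator of \nabla_\theta \ell_T(\theta), which
% can drive a gradient-ascent update
% \theta \leftarrow \theta + \eta\, \widehat{\nabla_\theta \ell_T}(\theta).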
Document type: Conference papers

Cited literature: 11 references

https://hal.archives-ouvertes.fr/hal-00830166
Contributor: Rémi Munos
Submitted on: Tuesday, June 4, 2013 - 3:12:15 PM
Last modification on: Thursday, February 21, 2019 - 10:52:49 AM
Long-term archiving on: Thursday, September 5, 2013 - 4:22:52 AM

File: sensitivity_HMM_nips09.pdf (produced by the author(s))

Identifiers

  • HAL Id: hal-00830166, version 1

Citation

Pierre-Arnaud Coquelin, Romain Deguest, Rémi Munos. Sensitivity analysis in HMMs with application to likelihood maximization. Advances in Neural Information Processing Systems, 2009, Canada. ⟨hal-00830166⟩
