A New Class of EM Algorithms. Escaping Local Minima and Handling Intractable Sampling

Abstract: The expectation-maximization (EM) algorithm is a powerful computational technique for maximum likelihood estimation in incomplete-data models. When the expectation step cannot be performed in closed form, a stochastic approximation of EM (SAEM) can be used. The convergence of SAEM toward local maxima of the observed likelihood has been proved, and its numerical efficiency has been demonstrated. However, despite its appealing features, the limit position of this algorithm can strongly depend on its starting position. Moreover, sampling from the posterior distribution may be intractable or computationally expensive. To cope with these two issues, we propose a new stochastic approximation version of EM in which we do not sample from the exact distribution in the expectation phase of the procedure. We first prove the convergence of this algorithm toward local maxima of the observed likelihood. We then propose an instantiation of this general procedure that favors convergence toward global maxima. Experiments on synthetic and real data highlight the performance of this algorithm in comparison to the SAEM.
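
To make the iteration described in the abstract concrete, below is a minimal sketch (not the authors' code) of a SAEM-type loop on a toy Gaussian latent-variable model. The model, step-size schedule, and all variable names are illustrative assumptions; the comment in the simulation step marks the point where the proposed algorithm would replace the exact posterior draw by a sample from an approximate distribution.

import numpy as np

rng = np.random.default_rng(0)

# Toy latent-variable model (hypothetical, for illustration only):
#   z_i ~ N(mu, 1)        (latent)
#   y_i ~ N(z_i, sigma2)  (observed)
# The sufficient statistic for mu is the mean of the latent z's.
sigma2 = 0.5
n = 500
true_mu = 2.0
z_true = rng.normal(true_mu, 1.0, size=n)
y = rng.normal(z_true, np.sqrt(sigma2), size=n)

def sample_posterior(y, mu, rng):
    """Exact posterior draw of z | y, mu (Gaussian, hence tractable in this toy case)."""
    post_var = 1.0 / (1.0 + 1.0 / sigma2)
    post_mean = post_var * (mu + y / sigma2)
    return rng.normal(post_mean, np.sqrt(post_var))

mu, s = 0.0, 0.0
for k in range(1, 201):
    # Simulation step: classical SAEM samples from the exact posterior p(z | y, mu_k).
    # The paper's proposal replaces this draw by a sample from an approximate
    # (cheaper or better-exploring) distribution; that variant is not reproduced here.
    z = sample_posterior(y, mu, rng)

    # Stochastic approximation of the E-step, with step sizes gamma_k = 1/k
    # (sum diverges, sum of squares converges).
    gamma = 1.0 / k
    s = s + gamma * (z.mean() - s)

    # M-step: maximize the complete-data likelihood given the approximated statistic.
    mu = s

print(f"estimated mu = {mu:.3f} (truth {true_mu})")

In this toy setting the posterior is Gaussian, so exact sampling is easy; the regime targeted by the paper is the one where that draw is intractable or costly and must be replaced by an approximation.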

https://hal.archives-ouvertes.fr/hal-02044722

Identifiers

  • HAL Id : hal-02044722, version 3

Citation

Stéphanie Allassonnière, Juliette Chevallier. A New Class of EM Algorithms. Escaping Local Minima and Handling Intractable Sampling. 2019. ⟨hal-02044722v3⟩
