Preprint

A New Class of EM Algorithms. Escaping Local Minima and Handling Intractable Sampling

Abstract: The expectation-maximization (EM) algorithm is a powerful computational technique for maximum likelihood estimation in incomplete data models. When the expectation step cannot be performed in closed form, a stochastic approximation of EM (SAEM) can be used. The convergence of the SAEM toward local maxima of the observed likelihood has been proved and its numerical efficiency has been demonstrated. However, despite these appealing features, the limit position of the algorithm can strongly depend on its starting point. Moreover, sampling from the posterior distribution may be intractable or computationally expensive. To cope with these two issues, we propose a new stochastic approximation version of EM in which we do not sample from the exact distribution in the expectation phase of the procedure. We first prove the convergence of this algorithm toward local maxima of the observed likelihood. We then propose an instantiation of this general procedure that favors convergence toward global maxima. Experiments on synthetic and real data highlight the performance of this algorithm in comparison to the SAEM.
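To make the abstract's ideas concrete, the following is a minimal, illustrative sketch of an SAEM-style iteration on a toy two-component Gaussian mixture. It is NOT the authors' algorithm: the sampling step draws latent labels from a *tempered* (hence approximate) posterior, which loosely illustrates both ingredients mentioned above — not sampling from the exact distribution, and encouraging escape from local maxima. All names, step-size choices, and the annealing schedule are assumptions made for this example.

```python
import numpy as np

def saem_gmm_means(x, n_iter=300, temp0=4.0, seed=0):
    """Illustrative SAEM-style estimation of the two means of a 1-D
    Gaussian mixture (unit variances, equal weights assumed known).

    S-step: latent labels are *sampled* from a tempered posterior
            (temperature temp0 annealed down to 1), i.e. an approximate
            rather than exact distribution.
    SA-step: sufficient statistics are smoothed with a decreasing
             Robbins-Monro step size gamma_k.
    M-step: means are updated from the smoothed statistics.
    """
    rng = np.random.default_rng(seed)
    mu = np.array([0.0, 0.1])          # deliberately poor starting point
    stat_sum = np.zeros(2)             # smoothed per-component sums
    stat_cnt = np.full(2, 1e-6)        # smoothed per-component counts
    for k in range(1, n_iter + 1):
        # anneal the temperature from temp0 down to 1 (assumed schedule)
        T = 1.0 + (temp0 - 1.0) * max(0.0, 1.0 - 3.0 * k / n_iter)
        # tempered posterior responsibilities
        logp = -0.5 * (x[:, None] - mu[None, :]) ** 2 / T
        p = np.exp(logp - logp.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        # S-step: sample labels instead of taking the exact expectation
        z = (rng.random(len(x)) < p[:, 1]).astype(int)
        cnt = np.array([(z == 0).sum(), (z == 1).sum()], dtype=float)
        ssum = np.array([x[z == 0].sum(), x[z == 1].sum()])
        # SA-step: stochastic approximation of the sufficient statistics
        gamma = 1.0 / k ** 0.7
        stat_cnt += gamma * (cnt - stat_cnt)
        stat_sum += gamma * (ssum - stat_sum)
        # M-step: maximize the complete-data likelihood given the statistics
        mu = stat_sum / np.maximum(stat_cnt, 1e-9)
    return np.sort(mu)
```

Starting both means near zero, plain EM can stall at a symmetric local optimum; the sampled, tempered E-step injects the noise that lets the iterates separate toward the true means.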
Cited literature: 24 references
Contributor: Juliette Chevallier
Submitted on: Tuesday, June 18, 2019 - 9:53:50 PM
Last modification on: Friday, December 3, 2021 - 11:43:07 AM




  • HAL Id: hal-02044722, version 3



Stéphanie Allassonnière, Juliette Chevallier. A New Class of EM Algorithms. Escaping Local Minima and Handling Intractable Sampling. 2019. ⟨hal-02044722v3⟩


