Journal articles

Properties of the Stochastic Approximation EM Algorithm with Mini-batch Sampling

Abstract: To deal with very large datasets, a mini-batch version of the Markov chain Monte Carlo Stochastic Approximation Expectation-Maximization (MCMC-SAEM) algorithm is proposed for general latent variable models. For exponential models, the algorithm is shown to be convergent under classical conditions as the number of iterations increases. Numerical experiments illustrate the performance of the mini-batch algorithm in various models. In particular, we highlight that mini-batch sampling yields a substantial speed-up of the convergence of the sequence of estimators generated by the algorithm. Moreover, insights into the effect of the mini-batch size on the limit distribution are presented. Finally, we illustrate how to use mini-batch sampling in practice to improve results under a constraint on computing time.
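To make the idea concrete, here is a minimal sketch of mini-batch stochastic approximation EM on a toy model. This is an assumed illustration, not the authors' implementation: the model (a two-component Gaussian mixture with unit variances and equal weights, means unknown), the step-size schedule, and all variable names are choices made for this example. At each iteration, the latent labels are refreshed only on a random mini-batch, and the complete-data sufficient statistics are updated by stochastic approximation before the closed-form M-step.

```python
import numpy as np

# Toy data (assumed setup, not the paper's experiments): two-component
# Gaussian mixture with unit variances and equal weights.
rng = np.random.default_rng(0)
n = 2000
true_mu = np.array([-2.0, 2.0])
z_true = rng.integers(0, 2, size=n)
y = true_mu[z_true] + rng.normal(size=n)

def minibatch_saem(y, n_iter=500, batch_size=200, seed=1):
    """Mini-batch SAEM sketch: refresh latent labels only on a random
    mini-batch, then update sufficient statistics by stochastic
    approximation with a decreasing step size."""
    rng = np.random.default_rng(seed)
    n = len(y)
    mu = np.array([-0.5, 0.5])          # initial means
    z = rng.integers(0, 2, size=n)      # current latent labels
    s_cnt = np.ones(2) * n / 2          # SA estimate of per-component counts
    s_sum = mu * n / 2                  # SA estimate of per-component sums
    for k in range(1, n_iter + 1):
        # Simulation step restricted to a mini-batch: draw labels from
        # their exact posterior given the current means.
        idx = rng.choice(n, size=batch_size, replace=False)
        logp = -0.5 * (y[idx, None] - mu[None, :]) ** 2
        p1 = 1.0 / (1.0 + np.exp(np.clip(logp[:, 0] - logp[:, 1], -50, 50)))
        z[idx] = (rng.random(batch_size) < p1).astype(int)
        # Stochastic approximation of the complete-data statistics.
        gamma = k ** -0.6
        cnt = np.array([(z == 0).sum(), (z == 1).sum()], dtype=float)
        ssum = np.array([y[z == 0].sum(), y[z == 1].sum()])
        s_cnt = (1 - gamma) * s_cnt + gamma * cnt
        s_sum = (1 - gamma) * s_sum + gamma * ssum
        # M-step: closed-form update of the means from the statistics.
        mu = s_sum / np.maximum(s_cnt, 1e-8)
    return np.sort(mu)

mu_hat = minibatch_saem(y)
```

Since each iteration simulates latent variables for only `batch_size` of the `n` observations, the per-iteration cost drops accordingly, which is the source of the speed-up discussed in the abstract.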

https://hal.archives-ouvertes.fr/hal-02189215
Contributor: Catherine Matias
Submitted on: Friday, July 19, 2019 - 11:15:17 AM
Last modification on: Friday, December 3, 2021 - 11:43:07 AM

Files

preprint.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-02189215, version 1
  • ARXIV: 1907.09164

Citation

Estelle Kuhn, Catherine Matias, Tabea Rebafka. Properties of the Stochastic Approximation EM Algorithm with Mini-batch Sampling. Statistics and Computing, Springer Verlag (Germany), 2020, 30 (6), pp.1725-1739. ⟨hal-02189215v1⟩
