Mini-batch learning of exponential family finite mixture models
Journal article in Statistics and Computing, 2020


Abstract

Mini-batch algorithms have become increasingly popular owing to the need to solve optimization problems over large-scale data sets. Using an existing online expectation-maximization (EM) algorithm framework, we demonstrate how mini-batch (MB) algorithms may be constructed, and we propose a scheme for the stochastic stabilization of the resulting algorithms. Theoretical results regarding the convergence of the mini-batch EM algorithms are presented. We then demonstrate how the mini-batch framework may be applied to conduct maximum likelihood (ML) estimation of mixtures of exponential family distributions, with emphasis on ML estimation for mixtures of normal distributions. Via a simulation study, we demonstrate that the mini-batch algorithm for mixtures of normal distributions can outperform the standard EM algorithm. Further evidence of the performance of the mini-batch framework is provided via an application to the famous MNIST data set.
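To make the abstract's idea concrete, the following is a minimal, illustrative sketch of one mini-batch EM update for a univariate two-or-more-component normal mixture, in the spirit of the online EM framework the authors build on. It is not the paper's algorithm: the paper covers the general exponential family and multivariate case and adds a stochastic stabilization scheme, none of which is reproduced here; all names (`s0`, `s1`, `s2`, `rho`) are our own notation for running averages of complete-data sufficient statistics and the step size.

```python
import numpy as np

def minibatch_em_step(X_batch, s0, s1, s2, rho):
    """One mini-batch EM update for a univariate K-component normal mixture.

    Illustrative sketch only. (s0, s1, s2) are running averages of the
    complete-data sufficient statistics (responsibilities, x-weighted and
    x^2-weighted responsibilities); they are blended with the mini-batch
    estimates using step size rho.
    """
    x = np.asarray(X_batch, dtype=float).reshape(-1, 1)   # shape (n, 1)
    # Current parameter estimates implied by the running statistics.
    w = s0                       # mixing weights
    mu = s1 / s0                 # component means
    var = s2 / s0 - mu ** 2      # component variances
    # E-step on the mini-batch: responsibilities, computed in log-space.
    log_p = (np.log(w) - 0.5 * np.log(2 * np.pi * var)
             - 0.5 * (x - mu) ** 2 / var)                 # shape (n, K)
    log_p -= log_p.max(axis=1, keepdims=True)
    resp = np.exp(log_p)
    resp /= resp.sum(axis=1, keepdims=True)
    # Mini-batch averages of the sufficient statistics.
    s0_b = resp.mean(axis=0)
    s1_b = (resp * x).mean(axis=0)
    s2_b = (resp * x ** 2).mean(axis=0)
    # Stochastic-approximation blend; the M-step is implicit in (w, mu, var).
    return ((1 - rho) * s0 + rho * s0_b,
            (1 - rho) * s1 + rho * s1_b,
            (1 - rho) * s2 + rho * s2_b)
```

Sweeping mini-batches through the data with a decaying step-size sequence such as rho_t = (t + 2)^(-0.6) and reading off mu = s1/s0 recovers the component means on well-separated data; the choice of step-size sequence is exactly the kind of tuning the paper's stabilization scheme addresses.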
Main file: 20190612_R1-FF2.pdf (1.27 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-02415068 , version 1 (16-12-2019)
hal-02415068 , version 2 (26-03-2020)

Cite

Hien D. Nguyen, Florence Forbes, Geoffrey J. McLachlan. Mini-batch learning of exponential family finite mixture models. Statistics and Computing, 2020, 30, pp.731-748. ⟨10.1007/s11222-019-09919-4⟩. ⟨hal-02415068v2⟩