Expectation Maximization
Expectation-Maximization (EM) is an iterative algorithm for finding maximum-likelihood estimates of parameters in statistical models with latent variables: an expectation (E) step computes the posterior distribution over the latent variables given the current parameters, and a maximization (M) step updates the parameters to maximize the resulting expected log-likelihood. Current research focuses on improving EM's convergence speed and robustness, particularly for high-dimensional data and complex models such as Gaussian mixtures, softmax mixtures, and models used in deep learning (e.g., transformers and diffusion models). These advances enable more efficient and accurate parameter estimation from incomplete or noisy data, with applications in fields such as federated learning, blind signal processing, and medical imaging. Developing theoretically grounded variants and efficient implementations remains a major focus.
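To make the E-step/M-step alternation concrete, here is a minimal sketch of EM for a two-component 1D Gaussian mixture (the canonical textbook case, not the method of any specific paper below); the function name, initialization scheme, and fixed iteration count are illustrative assumptions.

```python
import numpy as np

def em_gmm_1d(x, n_iter=100, seed=0):
    """Minimal EM sketch for a two-component 1D Gaussian mixture.

    Illustrative only: initialization and stopping rule are assumptions,
    and no numerical safeguards (e.g., variance floors) are included.
    """
    rng = np.random.default_rng(seed)
    pi = 0.5                                        # mixing weight of component 0
    mu = rng.choice(x, size=2, replace=False).astype(float)
    var = np.array([x.var(), x.var()])
    for _ in range(n_iter):
        # E-step: responsibility of component 0 for each point
        # (unnormalized Gaussian densities times mixing weights).
        p0 = pi * np.exp(-0.5 * (x - mu[0]) ** 2 / var[0]) / np.sqrt(var[0])
        p1 = (1 - pi) * np.exp(-0.5 * (x - mu[1]) ** 2 / var[1]) / np.sqrt(var[1])
        r = p0 / (p0 + p1)
        # M-step: closed-form maximum-likelihood updates
        # using the responsibilities as soft assignments.
        pi = r.mean()
        mu[0] = (r * x).sum() / r.sum()
        mu[1] = ((1 - r) * x).sum() / (1 - r).sum()
        var[0] = (r * (x - mu[0]) ** 2).sum() / r.sum()
        var[1] = ((1 - r) * (x - mu[1]) ** 2).sum() / (1 - r).sum()
    return pi, mu, var

# Two well-separated clusters; the fitted means should approach 0 and 5.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0.0, 1.0, 500), rng.normal(5.0, 1.0, 500)])
pi, mu, var = em_gmm_1d(data)
```

Each iteration is guaranteed not to decrease the observed-data log-likelihood, which is the core property that the theoretically grounded variants mentioned above build on.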
Papers
Unveiling the Cycloid Trajectory of EM Iterations in Mixed Linear Regression
Zhankun Luo, Abolfazl Hashemi
Non-negative Tensor Mixture Learning for Discrete Density Estimation
Kazu Ghalamkari, Jesper Løve Hinrich, Morten Mørup
SEMF: Supervised Expectation-Maximization Framework for Predicting Intervals
Ilia Azizi, Marc-Olivier Boldi, Valérie Chavez-Demoulin