Expectation Maximization
Expectation-Maximization (EM) is an iterative algorithm used to find maximum likelihood estimates of parameters in statistical models with latent variables. Current research focuses on improving EM's convergence speed and robustness, particularly for high-dimensional data and complex models like Gaussian mixtures, softmax mixtures, and those used in deep learning (e.g., transformers and diffusion models). These advancements are impacting diverse fields, including federated learning, blind signal processing, and medical imaging, by enabling more efficient and accurate parameter estimation in challenging scenarios with incomplete or noisy data. The development of theoretically grounded variants and efficient implementations continues to be a major focus.
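To make the iteration concrete, here is a minimal sketch of EM for the canonical latent-variable example mentioned above, a two-component one-dimensional Gaussian mixture. The E-step computes each point's posterior responsibility under the current parameters; the M-step re-estimates the mixing weights, means, and variances from those responsibilities. All names (`em_gmm_1d`, the initialization heuristic, the variance floor) are illustrative choices, not from any of the papers listed below.

```python
import math
import random

def gaussian_pdf(x, mu, var):
    """Density of N(mu, var) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm_1d(data, n_iter=50):
    """Illustrative EM for a two-component 1D Gaussian mixture."""
    # Heuristic initialization: equal weights, means at the data extremes.
    pi = [0.5, 0.5]
    mu = [min(data), max(data)]
    var = [1.0, 1.0]
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for x in data:
            p = [pi[k] * gaussian_pdf(x, mu[k], var[k]) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: maximize expected complete-data log-likelihood in closed form.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            var[k] = max(var[k], 1e-6)  # floor to guard against variance collapse
    return pi, mu, var

# Synthetic data from two well-separated Gaussians.
random.seed(0)
data = [random.gauss(-2.0, 0.5) for _ in range(200)] + \
       [random.gauss(3.0, 1.0) for _ in range(200)]
pi, mu, var = em_gmm_1d(data)
```

Each iteration is guaranteed not to decrease the observed-data log-likelihood, which is the classical convergence property that the research directions above (speed, robustness, high dimensions) aim to strengthen.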
Papers
Theoretical Analysis for Expectation-Maximization-Based Multi-Model 3D Registration
David Jin, Harry Zhang, Kai Chang
Towards Geometry-Aware Pareto Set Learning for Neural Multi-Objective Combinatorial Optimization
Yongfan Lu, Zixiang Di, Bingdong Li, Shengcai Liu, Hong Qian, Peng Yang, Ke Tang, Aimin Zhou