Expectation-Maximization Algorithm
The Expectation-Maximization (EM) algorithm is an iterative method for computing maximum likelihood estimates of parameters in statistical models with latent variables. Each iteration alternates between an E-step, which computes the posterior distribution of the hidden variables (and hence the expected complete-data log-likelihood) under the current parameter estimates, and an M-step, which updates the parameters to maximize that expectation; the observed-data likelihood is guaranteed not to decrease across iterations. Current research focuses on extending EM to diverse data types and model architectures, including diffusion models, Gaussian mixture models, and neural networks, often incorporating techniques such as variance reduction and distributed computation to improve efficiency and scalability. The algorithm finds applications across numerous fields, from image processing and speech enhancement to privacy-preserving machine learning and robust statistical inference.
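As a minimal sketch of the E-step/M-step alternation described above, the following Python snippet fits a two-component one-dimensional Gaussian mixture with EM. The synthetic data, the number of components, the initial parameter values, and the iteration count are illustrative assumptions, not taken from any of the papers listed below.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D data drawn from two Gaussians (illustrative assumption).
data = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.5, 200)])

# Initial guesses for the mixture parameters.
weights = np.array([0.5, 0.5])     # mixing proportions
means = np.array([-1.0, 1.0])      # component means
variances = np.array([1.0, 1.0])   # component variances


def gaussian_pdf(x, mean, var):
    """Density of a univariate Gaussian, evaluated elementwise."""
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2.0 * np.pi * var)


for iteration in range(100):
    # E-step: posterior responsibility of each component for each data point.
    likelihoods = np.stack(
        [w * gaussian_pdf(data, m, v) for w, m, v in zip(weights, means, variances)],
        axis=1,
    )
    responsibilities = likelihoods / likelihoods.sum(axis=1, keepdims=True)

    # M-step: re-estimate parameters from the responsibilities.
    effective_counts = responsibilities.sum(axis=0)
    weights = effective_counts / len(data)
    means = (responsibilities * data[:, None]).sum(axis=0) / effective_counts
    variances = (responsibilities * (data[:, None] - means) ** 2).sum(axis=0) / effective_counts

    # Observed-data log-likelihood; EM guarantees this is non-decreasing.
    log_likelihood = np.log(likelihoods.sum(axis=1)).sum()

print("weights:", weights)
print("means:", means)
print("variances:", variances)
print("final log-likelihood:", log_likelihood)
```

The same two-step structure carries over to the more elaborate settings mentioned above (mixtures embedded in neural networks, distributed or variance-reduced variants); only the form of the E-step posterior and the M-step update changes.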
Papers
The hybrid approach -- Convolutional Neural Networks and Expectation Maximization Algorithm -- for Tomographic Reconstruction of Hyperspectral Images
Mads J. Ahlebæk, Mads S. Peters, Wei-Chih Huang, Mads T. Frandsen, René L. Eriksen, Bjarke Jørgensen
Improvements to Supervised EM Learning of Shared Kernel Models by Feature Space Partitioning
Graham W. Pulford