Gaussian Mixture Model
Gaussian Mixture Models (GMMs) are probabilistic models that represent data as a weighted combination of Gaussian distributions, with the goal of identifying underlying clusters or patterns in complex datasets. Current research focuses on improving GMM robustness and efficiency, particularly in high-dimensional settings, through variants of the Expectation-Maximization (EM) algorithm, optimal transport methods, and integration with deep learning architectures such as neural networks and transformers. GMMs find broad application across diverse fields, including robotics (path planning, swarm control), audio processing (denoising, sound event detection), image processing (segmentation, registration), and financial modeling.
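To make the EM fitting procedure mentioned above concrete, here is a minimal NumPy sketch of EM for a one-dimensional two-component mixture. The function name `em_gmm_1d`, the quantile-based initialization, and the synthetic data are illustrative assumptions, not taken from any of the papers listed below; a production implementation would add log-space computations and a convergence check.

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=100):
    """Illustrative EM for a 1-D Gaussian mixture (sketch, not library code)."""
    n = x.shape[0]
    # Initialization (assumption): spread means over data quantiles,
    # shared variance from the data, uniform mixing weights.
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))
    var = np.full(k, x.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        dens = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) \
               / np.sqrt(2 * np.pi * var)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances from responsibilities.
        nk = resp.sum(axis=0)
        pi = nk / n
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

# Synthetic data: two well-separated Gaussian clusters.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-4, 1, 500), rng.normal(4, 1, 500)])
pi, mu, var = em_gmm_1d(x)
print(np.round(np.sort(mu)).astype(int).tolist())  # → [-4, 4]
```

The same model is available off the shelf as `sklearn.mixture.GaussianMixture`, which also handles multivariate data, several covariance structures, and multiple restarts.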
Papers
Personalized Federated Learning under Mixture of Distributions
Yue Wu, Shuaicheng Zhang, Wenchao Yu, Yanchi Liu, Quanquan Gu, Dawei Zhou, Haifeng Chen, Wei Cheng
A Spectral Algorithm for List-Decodable Covariance Estimation in Relative Frobenius Norm
Ilias Diakonikolas, Daniel M. Kane, Jasper C. H. Lee, Ankit Pensia, Thanasis Pittas