Gaussian Mixture Model
Gaussian Mixture Models (GMMs) are probabilistic models that represent data as a mixture of Gaussian distributions, aiming to identify underlying clusters or patterns within complex datasets. Current research focuses on improving GMM robustness and efficiency, particularly in high-dimensional spaces, through techniques such as Expectation-Maximization (EM) algorithms, optimal transport methods, and integration with deep learning architectures such as variational autoencoders and transformers. GMMs find broad application across diverse fields, including robotics (path planning, swarm control), audio processing (denoising, sound event detection), image processing (segmentation, registration), and financial modeling, demonstrating their versatility and impact on a wide range of scientific and engineering problems.
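To make the EM fitting procedure mentioned above concrete, here is a minimal sketch of EM for a two-component 1D GMM on synthetic data. The data, initialization scheme, and iteration count are illustrative assumptions, not from any of the papers listed below.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 1D data drawn from two well-separated Gaussians (illustrative)
data = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(5.0, 1.0, 300)])

def em_gmm(x, k=2, iters=100):
    # Initialize means at data quantiles, unit variances, uniform weights
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))
    var = np.ones(k)
    pi = np.ones(k) / k
    for _ in range(iters):
        # E-step: responsibility r[n, j] ∝ pi_j * N(x_n | mu_j, var_j)
        dens = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances from responsibilities
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

pi, mu, var = em_gmm(data)
print(np.sort(mu))  # component means should land near the true centers 0 and 5
```

In practice, library implementations (e.g. scikit-learn's `GaussianMixture`) add safeguards this sketch omits, such as covariance regularization and multiple random restarts.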
Papers
Distance Measure Based on an Embedding of the Manifold of K-Component Gaussian Mixture Models into the Manifold of Symmetric Positive Definite Matrices
Amit Vishwakarma, KS Subrahamanian Moosath
Deep Generative Clustering with VAEs and Expectation-Maximization
Michael Adipoetra, Ségolène Martin
TIMRL: A Novel Meta-Reinforcement Learning Framework for Non-Stationary and Multi-Task Environments
Chenyang Qi, Huiping Li, Panfeng Huang