Gaussian Mixture Model
Gaussian Mixture Models (GMMs) are probabilistic models that represent data as a weighted combination of Gaussian distributions, with the aim of uncovering latent clusters or patterns in complex datasets. Current research focuses on improving GMM robustness and efficiency, particularly in high-dimensional spaces, through variants of the Expectation-Maximization (EM) algorithm, optimal transport methods, and integration with deep learning architectures such as neural networks and transformers. GMMs are applied across diverse fields, including robotics (path planning, swarm control), audio processing (denoising, sound event detection), image processing (segmentation, registration), and financial modeling.
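To make the EM fitting procedure mentioned above concrete, here is a minimal sketch of EM for a one-dimensional GMM using only NumPy. The function name `fit_gmm_em`, the quantile-based initialization, and the fixed iteration count are illustrative choices, not taken from any of the papers listed below; production code would typically use a library implementation such as scikit-learn's `GaussianMixture`.

```python
import numpy as np

def fit_gmm_em(X, k, n_iter=100):
    """Fit a k-component Gaussian mixture to 1-D data X via EM.

    Returns (weights, means, variances)."""
    n = len(X)
    # Initialize means at spread-out quantiles, uniform weights,
    # and the global variance for every component.
    means = np.quantile(X, (np.arange(k) + 0.5) / k)
    variances = np.full(k, X.var())
    weights = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = P(component j | x_i),
        # computed in log space for numerical stability.
        diff = X[:, None] - means[None, :]                        # (n, k)
        log_pdf = -0.5 * (np.log(2 * np.pi * variances) + diff**2 / variances)
        log_weighted = np.log(weights) + log_pdf
        log_norm = np.logaddexp.reduce(log_weighted, axis=1, keepdims=True)
        r = np.exp(log_weighted - log_norm)
        # M-step: re-estimate parameters from the responsibilities.
        nk = r.sum(axis=0)                                        # effective counts
        weights = nk / n
        means = (r * X[:, None]).sum(axis=0) / nk
        variances = (r * (X[:, None] - means) ** 2).sum(axis=0) / nk
        variances = np.maximum(variances, 1e-6)  # guard against collapse
    return weights, means, variances
```

On data drawn from two well-separated Gaussians, the recovered means and weights land close to the generating parameters; extending the sketch to multivariate data replaces the scalar variances with covariance matrices.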
Papers
A Novel Multivariate Skew-Normal Mixture Model and Its Application in Path-Planning for Very-Large-Scale Robotic Systems
Pingping Zhu, Chang Liu, Peter Estephan
One-Bit Quantization and Sparsification for Multiclass Linear Classification via Regularized Regression
Reza Ghane, Danil Akhtiamov, Babak Hassibi
Gaussian Mixture Models for Affordance Learning using Bayesian Networks
Pedro Osório, Alexandre Bernardino, Ruben Martinez-Cantin, José Santos-Victor
Mixture-Models: a one-stop Python Library for Model-based Clustering using various Mixture Models
Siva Rajesh Kasa, Hu Yijie, Santhosh Kumar Kasa, Vaibhav Rajan
Make BERT-based Chinese Spelling Check Model Enhanced by Layerwise Attention and Gaussian Mixture Model
Yongchang Cao, Liang He, Zhen Wu, Xinyu Dai
Gaussian Mixture Proposals with Pull-Push Learning Scheme to Capture Diverse Events for Weakly Supervised Temporal Video Grounding
Sunoh Kim, Jungchan Cho, Joonsang Yu, YoungJoon Yoo, Jin Young Choi