Hierarchical Mixture
Hierarchical mixtures model complex data by combining simpler models in a layered structure, aiming to improve accuracy and efficiency in tasks such as classification and anomaly detection. Current research emphasizes developing sophisticated gating mechanisms beyond softmax, exploring diverse expert model types (e.g., Gaussian processes, Generalized Dirichlet classifiers, normalizing flows), and employing variational inference for efficient learning. These advances enhance the representational power and scalability of hierarchical mixtures, leading to improved performance in applications including robotics, image processing, and natural language processing.
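To make the layered structure concrete, the sketch below shows a minimal two-level mixture-of-experts: a top-level gate weights groups of experts, and a per-group gate weights the experts within each group. It is an illustrative assumption on our part (plain softmax gates and linear experts standing in for richer experts such as Gaussian processes or normalizing flows), not an implementation from the listed papers.

```python
# Minimal two-level hierarchical mixture-of-experts sketch (illustrative only;
# gate/expert forms and dimensions are assumptions, not from the papers below).
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)

d_in, d_out = 4, 1           # input / output dimensions
n_groups, n_experts = 2, 3   # top-level branches and experts per branch

# Top-level gate chooses among groups; each group has its own gate over experts.
W_top = rng.normal(size=(d_in, n_groups))
W_group = rng.normal(size=(n_groups, d_in, n_experts))
# Each expert is a simple linear map (a stand-in for richer expert models).
W_expert = rng.normal(size=(n_groups, n_experts, d_in, d_out))

def predict(x):
    """Soft two-level mixture: y(x) = sum_g pi_g(x) * sum_e pi_{e|g}(x) * f_{g,e}(x)."""
    top = softmax(x @ W_top)                     # group responsibilities, shape (n_groups,)
    y = np.zeros(d_out)
    for g in range(n_groups):
        inner = softmax(x @ W_group[g])          # expert responsibilities within group g
        for e in range(n_experts):
            y += top[g] * inner[e] * (x @ W_expert[g, e])
    return y

x = rng.normal(size=d_in)
print(predict(x))
```

In practice the gates and experts are learned jointly (e.g., by maximum likelihood or variational inference), and the softmax gates can be replaced by the more expressive gating mechanisms the summary mentions; this sketch only fixes the compositional structure.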
Papers
Unified Cross-Modal Image Synthesis with Hierarchical Mixture of Product-of-Experts
Reuben Dorent, Nazim Haouchine, Alexandra Golby, Sarah Frisken, Tina Kapur, William Wells
Hierarchical Mixture of Experts: Generalizable Learning for High-Level Synthesis
Weikai Li, Ding Wang, Zijian Ding, Atefeh Sohrabizadeh, Zongyue Qin, Jason Cong, Yizhou Sun