Expert Knowledge
Expert knowledge integration in machine learning aims to leverage human expertise to improve model performance and interpretability, addressing the limitations of purely data-driven approaches. Current research incorporates expert knowledge through several routes, notably Mixture-of-Experts (MoE) architectures, which route inputs to specialized sub-models for greater efficiency and adaptability, and techniques for upcycling pre-trained models with domain-specific knowledge. These advances improve model accuracy, efficiency, and trustworthiness across diverse applications, from medical diagnostics and speech recognition to natural language processing and time series forecasting.
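To make the MoE idea concrete, the sketch below shows a minimal sparse MoE layer in PyTorch, where a learned gate routes each input to its top-k expert feed-forward networks and mixes their outputs. The class name, layer sizes, and the top_k=2 routing choice are illustrative assumptions for this sketch and do not reproduce any specific method from the papers listed below.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MixtureOfExperts(nn.Module):
    """A minimal sparse MoE layer: a learned gate routes each input to its
    top-k expert feed-forward networks and mixes their outputs."""

    def __init__(self, d_model=64, d_hidden=128, num_experts=4, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, num_experts)   # gating network: one logit per expert
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.ReLU(), nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):                              # x: (batch, d_model)
        weights, indices = self.gate(x).topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)           # normalize over the selected experts only
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, slot] == e           # inputs whose slot-th choice is expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

# Example: route a batch of 8 embeddings through the layer.
layer = MixtureOfExperts()
print(layer(torch.randn(8, 64)).shape)   # torch.Size([8, 64])

Only the selected experts are evaluated per input, which is what gives sparse MoE layers their efficiency advantage over a single dense network of comparable capacity.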
Papers
A Survey on Mixture of Experts
Weilin Cai, Juyong Jiang, Fan Wang, Jing Tang, Sunghun Kim, Jiayi Huang
Mixture of Experts in a Mixture of RL settings
Timon Willi, Johan Obando-Ceron, Jakob Foerster, Karolina Dziugaite, Pablo Samuel Castro
SC-MoE: Switch Conformer Mixture of Experts for Unified Streaming and Non-streaming Code-Switching ASR
Shuaishuai Ye, Shunfei Chen, Xinhui Hu, Xinkang Xu
Deriving Hematological Disease Classes Using Fuzzy Logic and Expert Knowledge: A Comprehensive Machine Learning Approach with CBC Parameters
Salem Ameen, Ravivarman Balachandran, Theodoros Theodoridis
Variational Distillation of Diffusion Policies into Mixture of Experts
Hongyi Zhou, Denis Blessing, Ge Li, Onur Celik, Xiaogang Jia, Gerhard Neumann, Rudolf Lioutikov
Style Mixture of Experts for Expressive Text-To-Speech Synthesis
Ahad Jawaid, Shreeram Suresh Chandra, Junchen Lu, Berrak Sisman
Node-wise Filtering in Graph Neural Networks: A Mixture of Experts Approach
Haoyu Han, Juanhui Li, Wei Huang, Xianfeng Tang, Hanqing Lu, Chen Luo, Hui Liu, Jiliang Tang
Continual Traffic Forecasting via Mixture of Experts
Sanghyun Lee, Chanyoung Park