Expert Knowledge
Expert knowledge integration in machine learning aims to leverage human expertise to improve model performance and interpretability, addressing the limitations of purely data-driven approaches. Current research incorporates expert knowledge through several routes, including Mixture-of-Experts (MoE) architectures, which combine specialized sub-models for greater efficiency and adaptability, and techniques for upcycling pre-trained models with domain-specific knowledge. These advances matter for model accuracy, efficiency, and trustworthiness across diverse applications, from medical image analysis to natural language processing and time series forecasting.
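To make the MoE idea concrete, below is a minimal sketch of a sparse, top-k gated Mixture-of-Experts layer of the kind the first listed paper builds on: a learned gate scores all experts per input, only the k highest-scoring experts are evaluated, and their outputs are combined with renormalized gate weights. All names and dimensions here are illustrative assumptions, not code from the papers above.

```python
# Minimal sketch of a sparse (top-k gated) Mixture-of-Experts layer.
# Module names and dimensions are hypothetical, chosen for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    def __init__(self, d_model: int, d_hidden: int, n_experts: int = 8, k: int = 2):
        super().__init__()
        self.k = k
        # Each expert is an independent feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.ReLU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        )
        # The gate scores every expert for each input token.
        self.gate = nn.Linear(d_model, n_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, d_model)
        scores = self.gate(x)                              # (batch, n_experts)
        topk_scores, topk_idx = scores.topk(self.k, dim=-1)
        # Renormalize gate weights over the selected experts only.
        weights = F.softmax(topk_scores, dim=-1)           # (batch, k)
        out = torch.zeros_like(x)
        for slot in range(self.k):
            idx = topk_idx[:, slot]                        # expert chosen per input
            for e, expert in enumerate(self.experts):
                mask = idx == e
                if mask.any():                             # run each expert only on its inputs
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

layer = SparseMoE(d_model=64, d_hidden=256)
y = layer(torch.randn(32, 64))                             # -> shape (32, 64)
```

Because only k of the n_experts networks run per input, compute grows sublinearly with the number of experts, which is what makes sparse MoE attractive for the large-scale settings these papers target; production systems typically add a load-balancing loss on the gate, omitted here for brevity.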
Papers
Learning Large-scale Universal User Representation with Sparse Mixture of Experts
Caigao Jiang, Siqiao Xue, James Zhang, Lingyue Liu, Zhibo Zhu, Hongyan Hao
A multi-level interpretable sleep stage scoring system by infusing experts' knowledge into a deep network architecture
Hamid Niknazar, Sara C. Mednick