Expert Knowledge
Expert knowledge integration in machine learning aims to leverage human expertise to improve model performance and interpretability, addressing the limitations of purely data-driven approaches. Current research focuses on incorporating expert knowledge through a range of methods, including Mixture-of-Experts (MoE) architectures, which route inputs to specialized sub-models for greater efficiency and adaptability, and techniques for upcycling pre-trained models to absorb domain-specific knowledge. These advances improve model accuracy, efficiency, and trustworthiness across diverse applications, from medical image analysis to natural language processing and time series forecasting.
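The gating idea behind MoE architectures can be sketched in a few lines: a learned gate assigns per-input weights to several specialized experts, and the layer's output is the gate-weighted combination of the experts' outputs. The sketch below is a minimal, untrained toy in NumPy; it illustrates the mechanism only and does not reproduce any specific paper's architecture, and all class and variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # Numerically stable softmax along the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class MixtureOfExperts:
    """Toy MoE layer: a softmax gate weights the outputs of linear experts."""

    def __init__(self, n_experts, d_in, d_out):
        self.W_gate = rng.normal(size=(d_in, n_experts))
        self.W_experts = rng.normal(size=(n_experts, d_in, d_out))

    def forward(self, x):
        # Gate weights per example: shape (batch, n_experts), rows sum to 1.
        gates = softmax(x @ self.W_gate)
        # Every expert applied to every input: shape (n_experts, batch, d_out).
        expert_out = np.einsum("bi,eio->ebo", x, self.W_experts)
        # Gate-weighted combination of expert outputs: shape (batch, d_out).
        return np.einsum("be,ebo->bo", gates, expert_out)

moe = MixtureOfExperts(n_experts=4, d_in=8, d_out=3)
y = moe.forward(rng.normal(size=(5, 8)))
print(y.shape)  # (5, 3)
```

In practice, sparse variants keep only the top-k gate entries per input so that just a few experts run per example, which is what makes large MoE models efficient.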
Papers
Merging Deep Learning with Expert Knowledge for Seizure Onset Zone localization from rs-fMRI in Pediatric Pharmaco Resistant Epilepsy
Payal Kamboj, Ayan Banerjee, Sandeep K. S. Gupta, Varina L. Boerwinkle
Attention Weighted Mixture of Experts with Contrastive Learning for Personalized Ranking in E-commerce
Juan Gong, Zhenlin Chen, Chaoyi Ma, Zhuojian Xiao, Haonan Wang, Guoyu Tang, Lin Liu, Sulong Xu, Bo Long, Yunjiang Jiang
Incorporating Experts' Judgment into Machine Learning Models
Hogun Park, Aly Megahed, Peifeng Yin, Yuya Ong, Pravar Mahajan, Pei Guo
Master: Meta Style Transformer for Controllable Zero-Shot and Few-Shot Artistic Style Transfer
Hao Tang, Songhua Liu, Tianwei Lin, Shaoli Huang, Fu Li, Dongliang He, Xinchao Wang