Multiple Experts

Multiple-expert systems aim to improve performance and efficiency by combining the strengths of several specialized models or human experts. Current research focuses on effective routing mechanisms that select the appropriate expert for a given input, on architectures such as Mixture-of-Experts (MoE) and its variants (e.g., SoftMoE, HyperMoE), and on failure modes such as expert collapse, degradation, and underfitting. These advances improve accuracy and efficiency across diverse applications, from information retrieval and traffic prediction to multi-task learning and human-AI collaboration.
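The routing idea described above can be sketched concretely. The following is a minimal, illustrative top-k MoE layer in NumPy, not any particular paper's implementation: each expert is assumed to be a simple linear map, a learned router scores the experts per input, and the output is the gate-weighted sum of the top-k selected experts (the class and parameter names here are hypothetical).

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class TopKMoE:
    """Minimal top-k Mixture-of-Experts sketch with linear experts."""

    def __init__(self, dim_in, dim_out, n_experts=4, k=2, seed=0):
        rng = np.random.default_rng(seed)
        # Each expert is a (dim_in, dim_out) linear map.
        self.experts = [rng.normal(0, 0.02, (dim_in, dim_out))
                        for _ in range(n_experts)]
        # The router maps an input to one score per expert.
        self.router = rng.normal(0, 0.02, (dim_in, n_experts))
        self.k = k
        self.dim_out = dim_out

    def forward(self, x):
        logits = x @ self.router                        # (batch, n_experts)
        topk = np.argsort(logits, axis=-1)[:, -self.k:]  # top-k expert indices
        # Renormalize the gates over the selected experts only.
        gates = softmax(np.take_along_axis(logits, topk, axis=-1), axis=-1)
        out = np.zeros((x.shape[0], self.dim_out))
        for b in range(x.shape[0]):
            for slot in range(self.k):
                e = topk[b, slot]
                out[b] += gates[b, slot] * (x[b] @ self.experts[e])
        return out

moe = TopKMoE(dim_in=8, dim_out=4, n_experts=4, k=2)
x = np.random.default_rng(1).normal(size=(3, 8))
y = moe.forward(x)
```

Because only k of the experts run per input, compute grows with k rather than with the total expert count; the collapse problem mentioned above arises when the router concentrates traffic on a few experts, which is typically countered with an auxiliary load-balancing loss.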

Papers