Multiple Expert Systems
Multiple-expert systems aim to improve performance and efficiency by combining the strengths of several specialized models or human experts. Current research focuses on routing mechanisms that select the appropriate expert for a given input, on architectures such as Mixture-of-Experts (MoE) and its variants (e.g., SoftMoE, HyperMoE), and on failure modes such as expert collapse, degradation, and underfitting. These advances improve accuracy and efficiency across diverse applications, from information retrieval and traffic prediction to multi-task learning and human-AI collaboration.
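The routing idea above can be made concrete with a minimal sketch of sparse top-k MoE gating. This is a toy NumPy illustration under assumed shapes, not any specific paper's router: a learned gating matrix scores the experts for an input token, only the top-k experts are evaluated, and their outputs are mixed with renormalized softmax weights.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model, n_experts, top_k = 8, 4, 2

# Hypothetical parameters: one gating (router) matrix, and a single
# linear map standing in for each expert network.
W_gate = rng.normal(size=(d_model, n_experts))
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def moe_forward(x):
    """Route token vector x to its top-k experts and mix their outputs."""
    logits = x @ W_gate                   # one router score per expert
    chosen = np.argsort(logits)[-top_k:]  # indices of the k best-scoring experts
    weights = softmax(logits[chosen])     # renormalize over the chosen experts
    # Sparse computation: only the k selected experts actually run.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, chosen))

x = rng.normal(size=d_model)
y = moe_forward(x)
print(y.shape)  # (8,)
```

Because the unselected experts are never evaluated, compute per token stays roughly constant as the expert count grows; the failure modes mentioned above (e.g., expert collapse, where the router sends everything to one expert) arise from how these gating weights are trained.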