Multiple Expert
Multiple-expert systems aim to improve performance and efficiency by combining the strengths of several specialized models or human experts. Current research focuses on effective routing mechanisms that select the right expert for a given input, on architectures such as Mixture-of-Experts (MoE) and its variants (e.g., SoftMoE, HyperMoE), and on failure modes such as expert collapse, degradation, and underfitting. These advances improve accuracy and efficiency across diverse applications, from information retrieval and traffic prediction to multi-task learning and human-AI collaboration.
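To make the routing idea concrete, here is a minimal sketch of a token-level MoE layer with top-k gating, written in PyTorch. The class and parameter names (MoELayer, num_experts, top_k) are illustrative assumptions rather than the API of any particular paper, and the explicit loop over experts favors readability over the batched dispatch used in production systems.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Minimal Mixture-of-Experts layer with top-k routing (illustrative sketch)."""

    def __init__(self, dim: int, num_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Each expert is a small feed-forward network.
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
             for _ in range(num_experts)]
        )
        # The router scores every token against every expert.
        self.router = nn.Linear(dim, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_tokens, dim)
        logits = self.router(x)                              # (tokens, experts)
        weights, indices = logits.topk(self.top_k, dim=-1)   # keep the k best experts per token
        weights = F.softmax(weights, dim=-1)                 # renormalize over the chosen k
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, k] == e                    # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

# Usage: route a batch of 8 token embeddings through the layer.
layer = MoELayer(dim=16)
tokens = torch.randn(8, 16)
print(layer(tokens).shape)  # torch.Size([8, 16])
```

In practice, training such a layer typically adds an auxiliary load-balancing loss on the router outputs; without it, the router tends to send most tokens to a handful of experts, which is the expert collapse mentioned above.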