Time Series Foundation Models
Time series foundation models aim to provide general-purpose models that handle diverse time series tasks, such as forecasting and anomaly detection, across many domains without extensive task-specific retraining. Current research emphasizes scaling model capacity with sparse techniques like Mixture-of-Experts (MoE), exploring architectures such as transformers and MLP-Mixers, and developing effective pre-training and fine-tuning strategies, including contrastive learning and parameter-efficient methods like LoRA. These advances promise more accurate and efficient time series analysis across numerous scientific fields and practical applications, particularly in settings with limited labeled data.
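To make the MoE scaling idea concrete, the following is a minimal, illustrative sketch (not the method of any paper listed below) of a top-k sparse Mixture-of-Experts feed-forward block of the kind used to grow model capacity while keeping per-token compute roughly constant. All names and sizes (d_model, num_experts, top_k) are hypothetical.

```python
# Illustrative sketch only: a minimal top-k sparse MoE feed-forward layer,
# of the general kind used to scale transformer-based time series models.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMoEFeedForward(nn.Module):
    """Routes each token to its top-k experts and mixes their outputs."""

    def __init__(self, d_model: int = 64, d_hidden: int = 256,
                 num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts)  # gating network
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); flatten so routing is per token
        b, t, d = x.shape
        tokens = x.reshape(-1, d)
        gate_logits = self.router(tokens)                       # (b*t, num_experts)
        weights, chosen = gate_logits.topk(self.top_k, dim=-1)  # keep top-k experts
        weights = F.softmax(weights, dim=-1)                    # normalize over chosen experts

        out = torch.zeros_like(tokens)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = chosen[:, k] == e                        # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, k:k + 1] * expert(tokens[mask])
        return out.reshape(b, t, d)


if __name__ == "__main__":
    layer = SparseMoEFeedForward()
    series_patches = torch.randn(4, 32, 64)   # (batch, patches, d_model)
    print(layer(series_patches).shape)        # torch.Size([4, 32, 64])
```

Only the selected experts run for each token, so total parameter count can grow with the number of experts while the per-token cost stays close to that of a single dense feed-forward layer.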
Papers
Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixture of Experts
Xu Liu, Juncheng Liu, Gerald Woo, Taha Aksu, Yuxuan Liang, Roger Zimmermann, Chenghao Liu, Silvio Savarese, Caiming Xiong, Doyen Sahoo
GIFT-Eval: A Benchmark For General Time Series Forecasting Model Evaluation
Taha Aksu, Gerald Woo, Juncheng Liu, Xu Liu, Chenghao Liu, Silvio Savarese, Caiming Xiong, Doyen Sahoo