Large Time Series Model

Large Time Series Models (LTSMs) aim to provide universal forecasting models, analogous to Large Language Models, that handle diverse time series data without task-specific training. Current research centers on transformer-based architectures, exploring pre-training strategies over massive, heterogeneous datasets and techniques such as contrastive learning and statistical prompting to improve generalization. By enabling zero-shot and few-shot forecasting, this approach promises to improve the efficiency and accuracy of time series analysis across fields ranging from finance and healthcare to energy and transportation.
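The "zero-shot" usage pattern described above can be sketched as follows. This is a minimal illustration, not a real library: `PretrainedLTSM` and its `forecast` method are hypothetical names standing in for any transformer-based model pre-trained on a large, heterogeneous corpus; the placeholder forecast logic is a naive stand-in for the model's actual decoding.

```python
import numpy as np

class PretrainedLTSM:
    """Hypothetical stand-in for a pre-trained large time series model.

    The key point of the zero-shot interface: the model is never fitted
    on the target series; it only consumes a context window at inference.
    """

    def forecast(self, context: np.ndarray, horizon: int) -> np.ndarray:
        # A real LTSM would tokenize the context (e.g. into patches),
        # run it through a transformer, and decode future values.
        # Here a naive "repeat the recent window" placeholder keeps the
        # sketch runnable without inventing a real model API.
        period = min(len(context), horizon)
        return np.resize(context[-period:], horizon)

model = PretrainedLTSM()                              # no task-specific training
history = np.sin(np.arange(48) * 2 * np.pi / 24)      # an unseen hourly series
pred = model.forecast(history, horizon=24)            # zero-shot forecast
print(pred.shape)  # (24,)
```

Few-shot adaptation would differ only in allowing a small amount of fine-tuning on the target domain before calling `forecast`.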

Papers