Time Matter
"Time Matter" encompasses research efforts to effectively incorporate temporal dynamics into various machine learning tasks. Current research focuses on developing novel model architectures, such as recurrent neural networks and transformers adapted for time series analysis, and employing techniques like time-distributed convolutions and Hamiltonian learning to improve temporal modeling. This work is significant because accurately representing and reasoning about time is crucial for improving the performance and reliability of AI systems across diverse applications, from forecasting and risk estimation to medical diagnosis and personalized treatment.
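To make the "time-distributed" idea concrete: the same convolution weights are applied independently to every frame of an image time series, producing one feature map per time step that a downstream temporal model (e.g. an RNN or transformer) can then consume. The sketch below is a minimal illustration in plain NumPy, not taken from any of the papers listed here; the function names and the toy weather-image shapes are hypothetical.

```python
import numpy as np

def conv2d_valid(img, kernel):
    # Naive "valid" 2-D cross-correlation of one frame with one kernel.
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def time_distributed_conv(frames, kernel):
    # Apply the SAME kernel to every time step; weights are shared
    # across time, so temporal order is preserved for a later model.
    return np.stack([conv2d_valid(f, kernel) for f in frames])

# Toy weather image series: (time, height, width)
frames = np.random.rand(5, 8, 8)
kernel = np.ones((3, 3)) / 9.0  # simple 3x3 averaging filter
feats = time_distributed_conv(frames, kernel)
print(feats.shape)  # one 6x6 feature map per time step: (5, 6, 6)
```

In practice a deep-learning framework handles this with a wrapper layer (e.g. Keras's `TimeDistributed`), but the principle is the same: spatial feature extraction per frame, temporal reasoning afterwards.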
Papers
Time Awareness in Large Language Models: Benchmarking Fact Recall Across Time
David Herel, Vojtech Bartek, Tomas Mikolov
Time Distributed Deep Learning models for Purely Exogenous Forecasting. Application to Water Table Depth Prediction using Weather Image Time Series
Matteo Salis, Abdourrahmane M. Atto, Stefano Ferraris, Rosa Meo