Time Matter
"Time Matter" encompasses research efforts to effectively incorporate temporal dynamics into various machine learning tasks. Current research focuses on developing novel model architectures, such as recurrent neural networks and transformers adapted for time series analysis, and employing techniques like time-distributed convolutions and Hamiltonian learning to improve temporal modeling. This work is significant because accurately representing and reasoning about time is crucial for improving the performance and reliability of AI systems across diverse applications, from forecasting and risk estimation to medical diagnosis and personalized treatment.