Time Matter
"Time Matter" covers research on effectively incorporating temporal dynamics into machine learning tasks. Current work focuses on novel model architectures, such as recurrent neural networks and transformers adapted for time-series analysis, and on techniques like time-distributed convolutions and Hamiltonian learning that improve temporal modeling. This matters because accurately representing and reasoning about time is crucial to the performance and reliability of AI systems across diverse applications, from forecasting and risk estimation to medical diagnosis and personalized treatment.
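To make one of the techniques above concrete: a time-distributed convolution applies the same convolutional filter independently at every time step, so temporal ordering is preserved while each frame is processed identically. The sketch below is a minimal, hedged illustration in plain NumPy; the function name and shapes are hypothetical, not taken from any of the listed papers.

```python
import numpy as np

def time_distributed_conv1d(x, kernel):
    """Apply the same 1-D convolution filter to every time step of x.

    x: array of shape (time_steps, length), one 1-D frame per time step
    kernel: array of shape (k,), filter weights shared across all steps
    """
    # The key property: identical weights at each step, time axis untouched.
    return np.stack([np.convolve(frame, kernel, mode="valid") for frame in x])

# Example: 4 time steps, each a length-6 signal, smoothed by a 3-tap filter
x = np.arange(24, dtype=float).reshape(4, 6)
kernel = np.ones(3) / 3.0  # moving-average filter
out = time_distributed_conv1d(x, kernel)
print(out.shape)  # (4, 4): time dimension preserved, frame length reduced by k-1
```

In deep-learning frameworks the same idea appears as wrappers that broadcast a layer over the time axis (e.g. applying a shared 2-D convolution to each frame of a video before a recurrent layer).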
Papers
Backpropagation through space, time, and the brain
Benjamin Ellenberger, Paul Haider, Jakob Jordan, Kevin Max, Ismael Jaras, Laura Kriener, Federico Benitez, Mihai A. Petrovici
A Semi-Lagrangian Approach for Time and Energy Path Planning Optimization in Static Flow Fields
Víctor C. da S. Campos, Armando A. Neto, Douglas G. Macharet