Temporal Information
Temporal information processing focuses on effectively integrating time-dependent data into computational tasks, with the aim of improving accuracy and the understanding of dynamic systems. Current research emphasizes models that efficiently capture temporal dependencies, notably transformer architectures, recurrent neural networks (such as LSTMs), and graph-based methods for handling complex spatiotemporal relationships in diverse data types, including videos, sensor networks, and time series. This work matters for fields such as video summarization, action recognition, traffic prediction, and medical image analysis, where accurately modeling temporal dynamics is crucial for performance and interpretability.
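To make the recurrent approach concrete, below is a minimal sketch of a single-unit LSTM cell in pure Python. The class name, the per-gate weight values, and the helper functions are illustrative assumptions, not drawn from any of the papers listed on this page; a real model would learn these weights and operate on vectors, not scalars.

```python
import math


def sigmoid(x):
    """Logistic function; squashes gate pre-activations into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))


class ScalarLSTMCell:
    """Single-unit LSTM cell with scalar input and scalar hidden state.

    In practice the weights are learned via backpropagation through time;
    the fixed values here are arbitrary and for illustration only.
    """

    def __init__(self, weights=None):
        # (input weight, recurrent weight, bias) per gate:
        # i = input, f = forget, o = output, c = candidate.
        self.p = weights or {
            "i": (0.5, 0.4, 0.0),
            "f": (0.3, 0.6, 0.0),
            "o": (0.7, 0.2, 0.0),
            "c": (0.9, 0.5, 0.0),
        }

    def _preact(self, name, x, h):
        w, u, b = self.p[name]
        return w * x + u * h + b

    def step(self, x, h, c):
        i = sigmoid(self._preact("i", x, h))        # input gate
        f = sigmoid(self._preact("f", x, h))        # forget gate
        o = sigmoid(self._preact("o", x, h))        # output gate
        c_hat = math.tanh(self._preact("c", x, h))  # candidate update
        c = f * c + i * c_hat  # cell state carries long-range information
        h = o * math.tanh(c)   # hidden state is the gated readout
        return h, c


def encode_sequence(cell, xs):
    """Run the cell over a sequence and return the final hidden state."""
    h, c = 0.0, 0.0
    for x in xs:
        h, c = cell.step(x, h, c)
    return h
```

Because the hidden and cell states are threaded through every step, the final state depends on the order of the inputs, not just their values; this order sensitivity is what distinguishes temporal models from bag-of-features approaches.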
Papers
Are Large Language Models Temporally Grounded?
Yifu Qiu, Zheng Zhao, Yftah Ziser, Anna Korhonen, Edoardo M. Ponti, Shay B. Cohen
Carpe Diem: On the Evaluation of World Knowledge in Lifelong Language Models
Yujin Kim, Jaehong Yoon, Seonghyeon Ye, Sangmin Bae, Namgyu Ho, Sung Ju Hwang, Se-young Yun
TempTabQA: Temporal Question Answering for Semi-Structured Tables
Vivek Gupta, Pranshu Kandoi, Mahek Bhavesh Vora, Shuo Zhang, Yujie He, Ridho Reinanda, Vivek Srikumar