Time Matter
"Time Matter" encompasses research efforts to effectively incorporate temporal dynamics into various machine learning tasks. Current research focuses on developing novel model architectures, such as recurrent neural networks and transformers adapted for time series analysis, and employing techniques like time-distributed convolutions and Hamiltonian learning to improve temporal modeling. This work is significant because accurately representing and reasoning about time is crucial for improving the performance and reliability of AI systems across diverse applications, from forecasting and risk estimation to medical diagnosis and personalized treatment.
Papers
Time-MMD: A New Multi-Domain Multimodal Dataset for Time Series Analysis
Haoxin Liu, Shangqing Xu, Zhiyuan Zhao, Lingkai Kong, Harshavardhan Kamarthi, Aditya B. Sasanur, Megha Sharma, Jiaming Cui, Qingsong Wen, Chao Zhang, B. Aditya Prakash
The impact of deep learning aid on the workload and interpretation accuracy of radiologists on chest computed tomography: a cross-over reader study
Anvar Kurmukov, Valeria Chernina, Regina Gareeva, Maria Dugova, Ekaterina Petrash, Olga Aleshina, Maxim Pisov, Boris Shirokikh, Valentin Samokhin, Vladislav Proskurov, Stanislav Shimovolos, Maria Basova, Mikhail Goncharov, Eugenia Soboleva, Maria Donskova, Farukh Yaushev, Alexey Shevtsov, Alexey Zakharov, Talgat Saparov, Victor Gombolevskiy, Mikhail Belyaev
A Gap in Time: The Challenge of Processing Heterogeneous IoT Point Data in Buildings
Xiachong Lin, Arian Prabowo, Imran Razzak, Hao Xue, Matthew Amos, Sam Behrens, Stephen White, Flora D. Salim
Time-FFM: Towards LM-Empowered Federated Foundation Model for Time Series Forecasting
Qingxiang Liu, Xu Liu, Chenghao Liu, Qingsong Wen, Yuxuan Liang