Temporal Point Process
Temporal point processes (TPPs) model the timing and type of events occurring in continuous time, with the goal of predicting future events from historical event sequences. Current research relies heavily on neural architectures, particularly transformers and recurrent networks, often combined with techniques such as diffusion models and contrastive learning, to improve prediction accuracy and efficiency, especially for long-horizon forecasting and high-dimensional event data. The field is significant for applications across diverse domains, including finance, healthcare, and social networks, where it enables better understanding and prediction of complex event sequences. Recent work also emphasizes model interpretability and addresses challenges such as computational cost and the handling of irregularly spaced events.
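The papers below focus on neural variants, but the core object is easiest to see in its classical form: a TPP is typically specified through a conditional intensity function λ*(t), the instantaneous event rate given the history, and fitted by maximizing the log-likelihood Σ_i log λ*(t_i) − ∫_0^T λ*(s) ds. As a minimal, illustrative sketch (not taken from any of the listed papers; the function names are hypothetical), the following Python code simulates a univariate Hawkes process, the standard self-exciting TPP, using Ogata's thinning algorithm.

```python
import math
import random


def hawkes_intensity(t, history, mu, alpha, beta):
    """Conditional intensity lambda*(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i))."""
    return mu + sum(alpha * math.exp(-beta * (t - t_i)) for t_i in history if t_i < t)


def simulate_hawkes(mu, alpha, beta, t_max, seed=0):
    """Draw event times on [0, t_max] with Ogata's thinning algorithm."""
    rng = random.Random(seed)
    events, t = [], 0.0
    while True:
        # The intensity only decays between events, so its value just after
        # time t (plus alpha, in case an event landed exactly at t) bounds
        # lambda*(s) for all s > t until the next accepted event.
        lam_bar = hawkes_intensity(t, events, mu, alpha, beta) + alpha
        t += rng.expovariate(lam_bar)  # candidate next event time
        if t >= t_max:
            return events
        # Accept the candidate with probability lambda*(t) / lam_bar.
        if rng.random() * lam_bar <= hawkes_intensity(t, events, mu, alpha, beta):
            events.append(t)


# Example: a self-exciting process (alpha < beta keeps it stationary).
times = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, t_max=100.0)
print(f"simulated {len(times)} events; first few: {[round(t, 2) for t in times[:5]]}")
```

Neural TPPs, including those in the papers below, generally keep this likelihood structure but replace the fixed exponential kernel with a learned history encoder (e.g., a transformer or recurrent network) that parameterizes λ*(t).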
Papers
XTSFormer: Cross-Temporal-Scale Transformer for Irregular-Time Event Prediction in Clinical Applications
Tingsong Xiao, Zelin Xu, Wenchong He, Zhengkun Xiao, Yupu Zhang, Zibo Liu, Shigang Chen, My T. Thai, Jiang Bian, Parisa Rashidi, Zhe Jiang
Unveiling Latent Causal Rules: A Temporal Point Process Approach for Abnormal Event Explanation
Yiling Kuang, Chao Yang, Yang Yang, Shuang Li
Self-Supervised Contrastive Pre-Training for Multivariate Point Processes
Xiao Shou, Dharmashankar Subramanian, Debarun Bhattacharjya, Tian Gao, Kristin P. Bennett
Cumulative Distribution Function based General Temporal Point Processes
Maolin Wang, Yu Pan, Zenglin Xu, Ruocheng Guo, Xiangyu Zhao, Wanyu Wang, Yiqi Wang, Zitao Liu, Langming Liu