Sequential Models
Sequential models analyze and predict patterns in ordered data, exploiting temporal dependencies to improve accuracy and efficiency. Current research focuses on developing and refining architectures such as Mamba (a state space model) and Transformers, addressing challenges like computational complexity and the handling of diverse data types (tabular, image, time series). These advances are shaping fields including medical image analysis, recommendation systems, and reinforcement learning by enabling more accurate prediction and more efficient processing of complex sequential data. Efficient algorithms for long sequences and the incorporation of multi-modal information remain key areas of ongoing investigation.
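To make the state space model idea concrete, here is a minimal sketch of the discrete linear recurrence at the core of SSM architectures like Mamba: a hidden state is updated as h_t = A h_{t-1} + B x_t and read out as y_t = C h_t. The matrices below are illustrative fixed values chosen for this example; in a learned SSM they are trained parameters (and in Mamba, additionally input-dependent).

```python
def matvec(M, v):
    """Multiply matrix M (list of rows) by vector v."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def ssm_scan(A, B, C, xs):
    """Run the recurrence h_t = A h_{t-1} + B x_t, y_t = C h_t
    over a scalar input sequence xs, returning the scalar outputs."""
    h = [0.0] * len(B)          # initial hidden state h_0 = 0
    ys = []
    for x in xs:
        Ah = matvec(A, h)
        h = [a + b * x for a, b in zip(Ah, B)]          # state update
        ys.append(sum(c * hi for c, hi in zip(C, h)))   # readout
    return ys

# Illustrative 2-dimensional state with a stable transition matrix.
A = [[0.9, 0.0],
     [0.1, 0.8]]
B = [1.0, 0.0]
C = [0.0, 1.0]

# Impulse input: the outputs trace how the state propagates over time.
y = ssm_scan(A, B, C, [1.0, 0.0, 0.0])
```

Because the update is a simple linear scan, long sequences cost O(T) time and O(1) memory per step, which is the efficiency property that motivates SSMs over the quadratic attention of Transformers.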
Papers
Computation-Efficient Era: A Comprehensive Survey of State Space Models in Medical Image Analysis
Moein Heidari, Sina Ghorbani Kolahi, Sanaz Karimijafarbigloo, Bobby Azad, Afshin Bozorgpour, Soheila Hatami, Reza Azad, Ali Diba, Ulas Bagci, Dorit Merhof, Ilker Hacihaliloglu
Exploring User Retrieval Integration towards Large Language Models for Cross-Domain Sequential Recommendation
Tingjia Shen, Hao Wang, Jiaqing Zhang, Sirui Zhao, Liangyue Li, Zulong Chen, Defu Lian, Enhong Chen