Sequence Modeling
Sequence modeling aims to process and predict sequential data efficiently, a challenge that arises across diverse fields. Current research focuses on improving the efficiency and accuracy of models such as Transformers and State Space Models (SSMs), particularly on long sequences, by reducing computational cost (e.g., the quadratic cost of attention) and exploring novel architectures such as Mamba and its variants. These advances are impacting applications including natural language processing, time series forecasting, and reinforcement learning by enabling more accurate and efficient predictions from complex sequential data. Robust, efficient sequence models are therefore central to progress in these fields and to new possibilities in data analysis and decision-making.
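To make the efficiency claim concrete, here is a minimal sketch of the linear-time recurrent scan underlying discrete SSMs (the family Mamba builds on): a hidden state is updated once per time step, so the cost grows linearly with sequence length, versus the quadratic cost of full attention. The matrices and values below are toy illustrations, not taken from any paper listed here.

```python
import numpy as np

def ssm_scan(x, A, B, C):
    """Recurrent scan of a discrete state space model:
        h_t = A @ h_{t-1} + B @ x_t,   y_t = C @ h_t
    One state update per time step gives O(L) cost in sequence
    length L, versus the O(L^2) pairwise cost of attention.
    """
    h = np.zeros(A.shape[0])   # hidden state, initialized to zero
    ys = []
    for x_t in x:              # one update per time index
        h = A @ h + B @ x_t    # state transition + input injection
        ys.append(C @ h)       # linear readout
    return np.array(ys)

# Toy example: 1-D input, 2-D hidden state (illustrative values).
A = np.array([[0.9, 0.0], [0.1, 0.8]])   # state transition
B = np.array([[1.0], [0.5]])             # input projection
C = np.array([[1.0, -1.0]])              # output projection
x = np.ones((6, 1))                      # length-6 input sequence
y = ssm_scan(x, A, B, C)                 # shape (6, 1)
```

Because the recurrence is linear, the same computation can also be expressed as a parallel prefix scan or a long convolution, which is what makes SSMs attractive for long sequences on modern hardware.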
Papers
Computation-Efficient Era: A Comprehensive Survey of State Space Models in Medical Image Analysis
Moein Heidari, Sina Ghorbani Kolahi, Sanaz Karimijafarbigloo, Bobby Azad, Afshin Bozorgpour, Soheila Hatami, Reza Azad, Ali Diba, Ulas Bagci, Dorit Merhof, Ilker Hacihaliloglu
Efficient User Sequence Learning for Online Services via Compressed Graph Neural Networks
Yucheng Wu, Liyue Chen, Yu Cheng, Shuai Chen, Jinyu Xu, Leye Wang