State Space Model
State space models (SSMs) represent dynamic systems by tracking a hidden state that evolves over time. Current research focuses on efficient SSM architectures, such as Mamba and its variants, that overcome the limitations of traditional methods on long sequences and high-dimensional data, with applications in time series forecasting, image processing, and dynamic system modeling. These advances are improving the accuracy and scalability of SSMs across diverse fields, including medical image analysis, autonomous driving, and natural language processing. The resulting models offer a compelling alternative to computationally expensive architectures such as transformers, while matching or exceeding their performance in many applications.
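To make the hidden-state recurrence mentioned above concrete, the sketch below rolls a discrete-time linear state space model, x_{t+1} = A x_t + B u_t and y_t = C x_t + D u_t, over an input sequence. The matrices, dimensions, and the ssm_scan helper are illustrative assumptions for exposition only; they are not drawn from Mamba or any of the listed papers, which use structured, learned, and input-dependent parameterizations of these matrices.

```python
import numpy as np

def ssm_scan(A, B, C, D, inputs, x0=None):
    """Roll a discrete-time linear SSM over a sequence of inputs.

    Illustrative sketch only: real SSM layers (e.g., Mamba) use structured,
    learned matrices and parallel scan algorithms rather than a Python loop.
    """
    state_dim = A.shape[0]
    x = np.zeros(state_dim) if x0 is None else x0
    outputs = []
    for u in inputs:
        y = C @ x + D @ u   # emit an observation from the current hidden state
        x = A @ x + B @ u   # update the hidden state with the current input
        outputs.append(y)
    return np.stack(outputs)

# Example: a 2-dimensional hidden state driven by a scalar input sequence.
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
B = np.array([[1.0],
              [0.5]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])
u_seq = np.random.randn(16, 1)       # length-16 input sequence
y_seq = ssm_scan(A, B, C, D, u_seq)  # outputs, shape (16, 1)
```

Because each step depends only on the previous hidden state, inference cost grows linearly with sequence length, which is the core efficiency argument for SSMs over attention-based transformers.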
Papers
Mamba-360: Survey of State Space Models as Transformer Alternative for Long Sequence Modelling: Methods, Applications, and Challenges
Badri Narayana Patro, Vijay Srinivas Agneeswaran
Learning World Models With Hierarchical Temporal Abstractions: A Probabilistic Perspective
Vaisakh Shaj
Bi-Mamba+: Bidirectional Mamba for Time Series Forecasting
Aobo Liang, Xingguo Jiang, Yan Sun, Xiaohou Shi, Ke Li
FusionMamba: Efficient Image Fusion with State Space Model
Siran Peng, Xiangyu Zhu, Haoyu Deng, Zhen Lei, Liang-Jian Deng
SurvMamba: State Space Model with Multi-grained Multi-modal Interaction for Survival Prediction
Ying Chen, Jiajing Xie, Yuxiang Lin, Yuhang Song, Wenxian Yang, Rongshan Yu
DGMamba: Domain Generalization via Generalized State Space Model
Shaocong Long, Qianyu Zhou, Xiangtai Li, Xuequan Lu, Chenhao Ying, Yuan Luo, Lizhuang Ma, Shuicheng Yan