Recurrent Dynamics
Recurrent dynamics, the behavior of systems whose future state depends on their past, are central to understanding complex temporal processes across diverse fields. Current research focuses on improving the robustness and efficiency of recurrent neural networks (RNNs) for tasks such as memory retention, robotic control, and signal processing, employing architectures such as Transformers, RWKV, and gated linear RNNs with state expansion. These advances aim to address challenges like catastrophic forgetting and limited expressiveness, yielding more powerful and interpretable models with applications ranging from neuromorphic computing to advanced robotics and signal analysis.
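To make the core idea concrete, the sketch below shows a minimal gated linear recurrence of the kind loosely referred to above: the hidden state is updated by an input-dependent decay of its previous value plus a gated projection of the current input, with no nonlinearity applied to the state itself. All names and dimensions here are illustrative assumptions, not the API of any particular model; it also omits the state-expansion component mentioned in the summary.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gated_linear_rnn(x, W_a, W_b, W_h):
    """Minimal sketch of a gated linear recurrence (illustrative parameter names).

    x:   (T, d_in) input sequence
    W_a: (d_hidden, d_in) weights for the decay/forget gate
    W_b: (d_hidden, d_in) weights for the input gate
    W_h: (d_hidden, d_in) weights projecting the input into the state space
    Returns the (T, d_hidden) sequence of hidden states.
    """
    T, _ = x.shape
    d_hidden = W_h.shape[0]
    h = np.zeros(d_hidden)
    outputs = []
    for t in range(T):
        a = sigmoid(W_a @ x[t])      # decay gate in (0, 1): how much past state to keep
        b = sigmoid(W_b @ x[t])      # input gate in (0, 1): how much new input to admit
        h = a * h + b * (W_h @ x[t]) # linear in h: future state depends on past state
        outputs.append(h.copy())
    return np.stack(outputs)

# Example: a length-10 sequence of 4-dimensional inputs with an 8-dimensional state.
rng = np.random.default_rng(0)
x = rng.normal(size=(10, 4))
W_a, W_b, W_h = (rng.normal(size=(8, 4)) for _ in range(3))
print(gated_linear_rnn(x, W_a, W_b, W_h).shape)  # (10, 8)
```

Because the state update is linear in the hidden state, recurrences of this form can often be computed with parallel scans over the sequence, which is one reason such architectures are studied as efficient alternatives to classical nonlinear RNNs.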