Recurrent Network
Recurrent networks are neural network architectures that process sequential data by maintaining an internal state which evolves over time, allowing them to capture temporal dependencies. Current research focuses on improving their efficiency and scalability, particularly through novel architectures such as state-space models and through modifications to classic RNNs (LSTMs, GRUs) that enable parallel training. This renewed interest stems from the limitations of transformer models on long sequences and from the search for more biologically plausible learning algorithms; it has driven advances in areas such as online learning and applications in fields as diverse as recommender systems, medical image registration, and robotics.
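The core mechanism described above, an internal state updated at each time step, can be sketched as a vanilla recurrent cell. This is a minimal illustration in plain Python, not any specific model from the literature; the weight names, sizes, and initialization are all assumptions chosen for clarity.

```python
import math
import random

# Illustrative sketch of a vanilla recurrent cell (all names and sizes
# below are assumptions, not from the text). The hidden state h is
# updated at each step from the current input x_t and the previous
# state: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b).

random.seed(0)
input_size, hidden_size, seq_len = 3, 4, 5

def rand_matrix(rows, cols):
    """Small random weights, purely for demonstration."""
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)]
            for _ in range(rows)]

W_xh = rand_matrix(hidden_size, input_size)   # input -> hidden
W_hh = rand_matrix(hidden_size, hidden_size)  # hidden -> hidden (recurrence)
b = [0.0] * hidden_size

def rnn_step(x_t, h_prev):
    """One recurrence step: mix the new input with the carried state."""
    return [
        math.tanh(
            sum(W_xh[i][j] * x_t[j] for j in range(input_size))
            + sum(W_hh[i][k] * h_prev[k] for k in range(hidden_size))
            + b[i]
        )
        for i in range(hidden_size)
    ]

# Thread the state through a sequence: h after step t depends on every
# earlier input, which is how temporal dependencies are captured.
h = [0.0] * hidden_size
sequence = [[random.uniform(-1.0, 1.0) for _ in range(input_size)]
            for _ in range(seq_len)]
for x_t in sequence:
    h = rnn_step(x_t, h)

print(len(h))
```

Because the same `rnn_step` is applied at every position, the update is inherently sequential; the parallel-training variants mentioned above restructure this recurrence (for example, into associative scans) so that all time steps can be computed at once during training.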