Recurrent Networks
Recurrent networks are neural network architectures designed to process sequential data by maintaining an internal state that evolves over time, enabling them to capture temporal dependencies. Current research focuses on improving their efficiency and scalability, particularly through novel architectures such as state-space models and modifications to classic RNNs (LSTMs, GRUs) that enable parallel training. This renewed interest stems from the limitations of transformer models on long sequences and from a desire for more biologically plausible learning algorithms. It has driven advances in areas such as online learning, and applications in diverse fields including recommender systems, medical image registration, and robotics.
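The internal state described above can be sketched with a minimal vanilla (Elman-style) RNN step. This is an illustrative example, not drawn from any specific paper: the dimensions, weight names, and random initialization are all assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumed, not from the text)
input_dim, hidden_dim, seq_len = 4, 8, 5

# Parameters of a simple recurrent cell
W_xh = rng.normal(scale=0.1, size=(input_dim, hidden_dim))   # input-to-hidden
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))  # hidden-to-hidden
b_h = np.zeros(hidden_dim)

def rnn_step(x_t, h_prev):
    """One step of the recurrence: the new state mixes the current
    input with the previous state, which is how temporal
    dependencies are carried forward."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

# Process a sequence by threading the hidden state through time
xs = rng.normal(size=(seq_len, input_dim))
h = np.zeros(hidden_dim)
states = []
for x_t in xs:
    h = rnn_step(x_t, h)
    states.append(h)

print(np.stack(states).shape)  # one hidden state per time step
```

The strictly sequential dependence of `h` on `h_prev` is exactly what makes classic RNNs slow to train on long sequences, and what the state-space models and parallelizable LSTM/GRU variants mentioned above aim to work around.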
Papers