Traditional RNNs
Traditional recurrent neural networks (RNNs) process sequential data by maintaining an internal hidden state that is updated at each time step, allowing them to capture temporal dependencies. Current research focuses on improving how well RNNs learn from long sequences, on gated architectures such as LSTMs and GRUs, and on newer designs such as Mamba and RWKV that address limitations in computational efficiency and long-term memory. These efforts are driven by the need for more robust, efficient, and interpretable sequence models, with applications in time series forecasting, natural language processing, and image analysis.
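The recurrent update described above can be made concrete with a minimal sketch. Assuming a vanilla (Elman-style) cell with a tanh nonlinearity and toy dimensions chosen purely for illustration, the hidden state h_t is computed from the current input x_t and the previous state h_{t-1}:

```python
import numpy as np

# Minimal sketch of a vanilla (Elman-style) RNN cell, assuming a tanh
# activation and randomly initialized weights; dimensions are illustrative.
rng = np.random.default_rng(0)
input_size, hidden_size, seq_len = 8, 16, 5

W_xh = rng.standard_normal((hidden_size, input_size)) * 0.1   # input-to-hidden weights
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1  # hidden-to-hidden (recurrent) weights
b_h = np.zeros(hidden_size)                                   # hidden bias

x = rng.standard_normal((seq_len, input_size))  # a toy input sequence
h = np.zeros(hidden_size)                       # initial hidden state

# At each time step the hidden state is updated from the current input
# and the previous state: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h)
for t in range(seq_len):
    h = np.tanh(W_xh @ x[t] + W_hh @ h + b_h)

print(h.shape)  # (16,) -- the final hidden state summarizes the sequence
```

Because the same recurrent weight matrix is applied at every step, gradients propagated back through long sequences can shrink or grow exponentially, which is the learnability issue that gated cells (LSTMs, GRUs) and newer designs such as Mamba and RWKV aim to mitigate.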