Residual Recurrent Network
Residual recurrent networks (RRNs) augment recurrent architectures, which process sequential data, with residual (skip) connections that alleviate vanishing/exploding gradients and improve training efficiency. Current research focuses on optimizing RRN architectures, for example by exploring variations in residual connections that enhance memory properties and expressiveness, and on applying them to diverse tasks including speech enhancement, image super-resolution, and natural language processing. These advances demonstrate RRNs' effectiveness across domains, yielding improved performance on tasks that require both sequential processing and deep network architectures.
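The core idea can be sketched in a few lines: instead of replacing the hidden state at each step, the cell adds a learned update to it, so gradients can flow through time along the identity path. This is a minimal NumPy sketch of one such cell, not any specific published architecture; the function name and weight shapes are illustrative assumptions.

```python
import numpy as np

def residual_rnn_step(x_t, h_prev, W_x, W_h, b):
    # Standard tanh RNN update, but added to h_prev rather than
    # replacing it: the residual (identity) path shortens the
    # gradient route through time. (Illustrative sketch only.)
    delta = np.tanh(x_t @ W_x + h_prev @ W_h + b)
    return h_prev + delta  # residual connection

rng = np.random.default_rng(0)
input_dim, hidden_dim, seq_len = 4, 8, 5
W_x = rng.normal(scale=0.1, size=(input_dim, hidden_dim))
W_h = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
b = np.zeros(hidden_dim)

h = np.zeros(hidden_dim)
for t in range(seq_len):
    x_t = rng.normal(size=input_dim)
    h = residual_rnn_step(x_t, h, W_x, W_h, b)

print(h.shape)  # (8,)
```

Because the update is additive, the Jacobian of each step is the identity plus a (small) learned term, which is the same mechanism that makes residual connections effective in deep feedforward networks.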
Papers
July 27, 2023
March 23, 2023
March 14, 2023
November 17, 2022
October 3, 2022
January 2, 2022