Recurrent Neural Network
Recurrent Neural Networks (RNNs) are a class of neural networks that process sequential data by maintaining an internal state updated at each time step. Current research focuses on improving RNN efficiency and stability, on gated variants such as LSTMs and GRUs, and on applications across diverse fields such as time series forecasting, natural language processing, and dynamical systems modeling. This includes developing novel architectures, such as selective state space models for improved memory efficiency, and combining RNNs with other architectures, such as transformers and convolutional neural networks. These advances benefit applications that require sequential data processing, offering improved accuracy, efficiency, and interpretability.
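The "internal state updated at each time step" described above can be made concrete with a minimal vanilla RNN forward pass. This is an illustrative sketch only (all names such as `rnn_step` and `W_xh` are invented here, not drawn from any of the papers below); real systems would use gated cells like LSTMs or GRUs and a deep learning framework.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One recurrence step: h_t = tanh(W_xh @ x_t + W_hh @ h_prev + b_h)."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

def rnn_forward(xs, h0, W_xh, W_hh, b_h):
    """Run the cell over a whole input sequence, collecting every hidden state."""
    h, states = h0, []
    for x_t in xs:
        h = rnn_step(x_t, h, W_xh, W_hh, b_h)  # state carries information forward
        states.append(h)
    return states

# Toy example: a 5-step sequence of 3-dimensional inputs, 4 hidden units.
rng = np.random.default_rng(0)
input_dim, hidden_dim, seq_len = 3, 4, 5
W_xh = rng.normal(scale=0.1, size=(hidden_dim, input_dim))
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
b_h = np.zeros(hidden_dim)
xs = [rng.normal(size=input_dim) for _ in range(seq_len)]

states = rnn_forward(xs, np.zeros(hidden_dim), W_xh, W_hh, b_h)
print(len(states), states[-1].shape)
```

Because the same weights are reused at every step, the network handles sequences of arbitrary length; the spectral properties of `W_hh` govern the stability issues (vanishing/exploding gradients) that motivate much of the research summarized above.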
Papers
Humans' Social Relationship Classification during Accompaniment
Oscar Castro, Ely Repiso, Anais Garrell, Alberto Sanfeliu
Tractable Dendritic RNNs for Reconstructing Nonlinear Dynamical Systems
Manuel Brenner, Florian Hess, Jonas M. Mikhaeil, Leonard Bereska, Zahra Monfared, Po-Chen Kuo, Daniel Durstewitz
Composite FORCE learning of chaotic echo state networks for time-series prediction
Yansong Li, Kai Hu, Kohei Nakajima, Yongping Pan
Ultra-low latency recurrent neural network inference on FPGAs for physics applications with hls4ml
Elham E Khoda, Dylan Rankin, Rafael Teixeira de Lima, Philip Harris, Scott Hauck, Shih-Chieh Hsu, Michael Kagan, Vladimir Loncar, Chaitanya Paikara, Richa Rao, Sioni Summers, Caterina Vernieri, Aaron Wang
Rapid training of quantum recurrent neural networks
Michał Siemaszko, Adam Buraczewski, Bertrand Le Saux, Magdalena Stobińska
An Intensity and Phase Stacked Analysis of Phase-OTDR System using Deep Transfer Learning and Recurrent Neural Networks
Ceyhun Efe Kayan, Kivilcim Yuksel Aldogan, Abdurrahman Gumus
From Tensor Network Quantum States to Tensorial Recurrent Neural Networks
Dian Wu, Riccardo Rossi, Filippo Vicentini, Giuseppe Carleo
Asymptotic Stability in Reservoir Computing
Jonathan Dong, Erik Börve, Mushegh Rafayelyan, Michael Unser
TSFEDL: A Python Library for Time Series Spatio-Temporal Feature Extraction and Prediction using Deep Learning (with Appendices on Detailed Network Architectures and Experimental Cases of Study)
Ignacio Aguilera-Martos, Ángel M. García-Vico, Julián Luengo, Sergio Damas, Francisco J. Melero, José Javier Valle-Alonso, Francisco Herrera