Long Short-Term Memory
Long Short-Term Memory (LSTM) networks are a type of recurrent neural network designed to process sequential data by learning long-term dependencies, which makes them well suited to prediction and classification tasks over time series and other ordered data. Current research focuses on enhancing LSTM architectures, for example by incorporating convolutional layers, attention mechanisms, and hybrid models that combine LSTMs with other deep learning techniques such as transformers or graph neural networks, to improve efficiency and accuracy. This work matters because LSTMs remain effective across diverse fields, from financial forecasting and environmental monitoring to medical image analysis and activity recognition, offering powerful tools for analyzing complex temporal data.
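To make the "learning long-term dependencies" claim concrete, here is a minimal NumPy sketch of a single LSTM cell's forward step. It is not taken from any of the papers listed below; all weight names, sizes, and the random toy sequence are illustrative. The key idea is the gated cell state c, which can carry information across many time steps because the forget gate can keep it nearly unchanged.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, params):
    """One LSTM time step: forget/input/output gates plus a candidate cell update."""
    Wf, Uf, bf, Wi, Ui, bi, Wo, Uo, bo, Wg, Ug, bg = params
    f = sigmoid(Wf @ x + Uf @ h_prev + bf)   # forget gate: how much old memory to keep
    i = sigmoid(Wi @ x + Ui @ h_prev + bi)   # input gate: how much new info to write
    o = sigmoid(Wo @ x + Uo @ h_prev + bo)   # output gate: how much memory to expose
    g = np.tanh(Wg @ x + Ug @ h_prev + bg)   # candidate cell content
    c = f * c_prev + i * g                   # new cell state (the long-term memory)
    h = o * np.tanh(c)                       # new hidden state (the short-term output)
    return h, c

# Toy dimensions: 3-dimensional inputs, 4-dimensional hidden state.
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
params = []
for _ in range(4):  # one (W, U, b) triple per gate: f, i, o, g
    params += [rng.standard_normal((n_hid, n_in)) * 0.1,
               rng.standard_normal((n_hid, n_hid)) * 0.1,
               np.zeros(n_hid)]
params = tuple(params)

# Unroll the cell over a short random sequence.
h = np.zeros(n_hid)
c = np.zeros(n_hid)
for x in rng.standard_normal((5, n_in)):
    h, c = lstm_step(x, h, c, params)
print(h.shape)  # final hidden state, one vector per sequence
```

Because c is updated additively (c = f * c_prev + i * g) rather than squashed through a nonlinearity at every step, gradients along the cell state decay far more slowly than in a plain RNN, which is what lets LSTMs capture long-range dependencies.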
Papers
Neural Speech Enhancement with Very Low Algorithmic Latency and Complexity via Integrated Full- and Sub-Band Modeling
Zhong-Qiu Wang, Samuele Cornell, Shukjae Choi, Younglo Lee, Byeong-Yeol Kim, Shinji Watanabe
Human Activity Recognition Using Deep Learning Approaches and Single Frame CNN and Convolutional LSTM
Sheryl Mathew, Annapoorani Subramanian, Pooja, Balamurugan MS, Manoj Kumar Rajagopal
Using LSTM and GRU With a New Dataset for Named Entity Recognition in the Arabic Language
Alaa Shaker, Alaa Aldarf, Igor Bessmertny
Deep Long-Short Term Memory networks: Stability properties and Experimental validation
Fabio Bonassi, Alessio La Bella, Giulio Panzani, Marcello Farina, Riccardo Scattolini
In Search of Deep Learning Architectures for Load Forecasting: A Comparative Analysis and the Impact of the Covid-19 Pandemic on Model Performance
Sotiris Pelekis, Evangelos Karakolis, Francisco Silva, Vasileios Schoinas, Spiros Mouzakitis, Georgios Kormpakis, Nuno Amaro, John Psarras
A Preliminary Study on Pattern Reconstruction for Optimal Storage of Wearable Sensor Data
Sazia Mahfuz, Farhana Zulkernine