Long Short-Term Memory
Long Short-Term Memory (LSTM) networks are a type of recurrent neural network designed to process sequential data by learning long-term dependencies, enabling accurate prediction and classification across many applications. Current research focuses on enhancing LSTM architectures, for example by adding convolutional layers or attention mechanisms, and on hybrid models that combine LSTMs with other deep learning techniques such as transformers or graph neural networks, to improve efficiency and accuracy. This work is significant because LSTMs have proven highly effective across diverse fields, from financial forecasting and environmental monitoring to medical image analysis and activity recognition, offering powerful tools for analyzing complex temporal data.
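The long-term dependency learning mentioned above comes from the LSTM's gating mechanism: input, forget, and output gates regulate what enters, persists in, and leaves a separate cell state. As a minimal sketch of one time step, here is a toy scalar LSTM cell in pure Python; the weight names (`wi`, `ui`, `bi`, etc.) are illustrative, not taken from any of the papers below:

```python
import math

def sigmoid(x):
    """Logistic function, squashing gate pre-activations into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def lstm_cell_step(x, h_prev, c_prev, W):
    """One LSTM time step for a toy 1-dimensional cell.

    x       : current input (scalar)
    h_prev  : previous hidden state
    c_prev  : previous cell state (the long-term memory)
    W       : dict of scalar weights/biases (illustrative names)
    """
    i = sigmoid(W["wi"] * x + W["ui"] * h_prev + W["bi"])    # input gate
    f = sigmoid(W["wf"] * x + W["uf"] * h_prev + W["bf"])    # forget gate
    o = sigmoid(W["wo"] * x + W["uo"] * h_prev + W["bo"])    # output gate
    g = math.tanh(W["wg"] * x + W["ug"] * h_prev + W["bg"])  # candidate update
    c = f * c_prev + i * g       # cell state: keep some old memory, add some new
    h = o * math.tanh(c)         # hidden state: gated read-out of the cell
    return h, c

# Running the cell over a sequence accumulates context in (h, c):
W = {k: 0.1 for k in ("wi", "ui", "bi", "wf", "uf", "bf",
                      "wo", "uo", "bo", "wg", "ug", "bg")}
h, c = 0.0, 0.0
for x in [0.5, -0.2, 0.8]:
    h, c = lstm_cell_step(x, h, c, W)
```

Because the forget gate multiplies the previous cell state rather than repeatedly passing it through a squashing nonlinearity, gradients can flow across many time steps, which is what lets LSTMs capture long-range structure that plain RNNs lose.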
Papers
Social Media as an Instant Source of Feedback on Water Quality
Khubaib Ahmad, Muhammad Asif Ayub, Kashif Ahmad, Jebran Khan, Nasir Ahmad, Ala Al-Fuqaha
Fault Detection and Diagnosis with Imbalanced and Noisy Data: A Hybrid Framework for Rotating Machinery
Masoud Jalayer, Amin Kaboli, Carlotta Orsenigo, Carlo Vercellis
Dual-CLVSA: a Novel Deep Learning Approach to Predict Financial Markets with Sentiment Measurements
Jia Wang, Hongwei Zhu, Jiancheng Shen, Yu Cao, Benyuan Liu
LiteLSTM Architecture for Deep Recurrent Neural Networks
Nelly Elsayed, Zag ElSayed, Anthony S. Maida
Deep Recurrent Learning for Heart Sounds Segmentation based on Instantaneous Frequency Features
Alvaro Joaquín Gaona, Pedro David Arini
Micro-level Reserving for General Insurance Claims using a Long Short-Term Memory Network
Ihsan Chaoubi, Camille Besse, Hélène Cossette, Marie-Pier Côté