Long Short-Term Memory
Long Short-Term Memory (LSTM) networks are a type of recurrent neural network designed to process sequential data by learning long-term dependencies, enabling accurate predictions and classifications in various applications. Current research focuses on enhancing LSTM architectures, such as incorporating convolutional layers, attention mechanisms, and hybrid models combining LSTMs with other deep learning techniques like transformers or graph neural networks, to improve efficiency and accuracy. This work is significant because LSTMs are proving highly effective across diverse fields, from financial forecasting and environmental monitoring to medical image analysis and activity recognition, offering powerful tools for analyzing complex temporal data.
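The core mechanism described above (gated updates to a persistent cell state that lets the network retain long-term dependencies) can be sketched in a few lines. The following is a minimal, illustrative single-cell forward pass in NumPy; the weight shapes, gate ordering, and toy sequence are assumptions for demonstration, not taken from any of the papers listed below.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step: gates regulate what the cell state keeps,
    adds, and exposes. Gate ordering (f, i, o, g) is a convention
    chosen here for illustration."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b      # stacked pre-activations, shape (4H,)
    f = sigmoid(z[:H])              # forget gate: what to keep from c_prev
    i = sigmoid(z[H:2 * H])         # input gate: how much new info to write
    o = sigmoid(z[2 * H:3 * H])     # output gate: what to expose as h
    g = np.tanh(z[3 * H:])          # candidate cell update
    c = f * c_prev + i * g          # cell state carries long-term memory
    h = o * np.tanh(c)              # hidden state is the per-step output
    return h, c

# Run a toy 5-step sequence through the cell (sizes are arbitrary).
rng = np.random.default_rng(0)
D, H = 3, 4                         # input and hidden dimensions
W = rng.normal(0, 0.1, (4 * H, D))
U = rng.normal(0, 0.1, (4 * H, H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(5):
    h, c = lstm_step(rng.normal(size=D), h, c, W, U, b)
```

The additive update `c = f * c_prev + i * g` is what distinguishes an LSTM from a plain recurrent layer: gradients can flow through the cell state without repeatedly passing through squashing nonlinearities, which is why long-range dependencies remain learnable.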
Papers
Probabilistic Prediction of Longitudinal Trajectory Considering Driving Heterogeneity with Interpretability
Shuli Wang, Kun Gao, Lanfang Zhang, Yang Liu, Lei Chen
Modelling and characterization of fine Particulate Matter dynamics in Bujumbura using low cost sensors
Egide Ndamuzi, Rachel Akimana, Paterne Gahungu, Elie Bimenyimana
Towards Verifiable Text Generation with Evolving Memory and Self-Reflection
Hao Sun, Hengyi Cai, Bo Wang, Yingyan Hou, Xiaochi Wei, Shuaiqiang Wang, Yan Zhang, Dawei Yin
LSTM Network Analysis of Vehicle-Type Fatalities on Great Britain's Roads
Abiodun Finbarrs Oketunji, James Hanify, Salter Heffron-Smith