Long Short-Term Memory
Long Short-Term Memory (LSTM) networks are a type of recurrent neural network designed to model sequential data by learning long-term dependencies through gated memory cells, which mitigate the vanishing-gradient problem of standard RNNs and enable accurate prediction and classification over long sequences. Current research focuses on enhancing LSTM architectures, for example by incorporating convolutional layers and attention mechanisms, or by building hybrid models that combine LSTMs with other deep learning techniques such as transformers or graph neural networks, to improve efficiency and accuracy. This work is significant because LSTMs remain highly effective across diverse fields, from financial forecasting and environmental monitoring to medical image analysis and activity recognition, providing powerful tools for analyzing complex temporal data.
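To make the architectural ideas above concrete, here is a minimal sketch, assuming PyTorch, of a bidirectional LSTM classifier with a simple attention layer over the time steps. The class and parameter names (SequenceClassifier, hidden_dim, and so on) are illustrative and are not taken from any of the papers listed below.

```python
# Illustrative sketch (assumed PyTorch): a bidirectional LSTM encoder with a
# learned attention weighting over time steps, followed by a linear classifier.
import torch
import torch.nn as nn


class SequenceClassifier(nn.Module):
    def __init__(self, input_dim: int, hidden_dim: int, num_classes: int):
        super().__init__()
        # Bidirectional LSTM encodes each sequence in both directions.
        self.lstm = nn.LSTM(input_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # A single learned projection scores every time step for attention.
        self.attn = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, input_dim)
        outputs, _ = self.lstm(x)                           # (batch, seq_len, 2*hidden_dim)
        weights = torch.softmax(self.attn(outputs), dim=1)  # (batch, seq_len, 1)
        context = (weights * outputs).sum(dim=1)            # weighted sum over time
        return self.classifier(context)                     # (batch, num_classes)


# Usage: classify batches of 30-step sequences with 8 features into 3 classes.
model = SequenceClassifier(input_dim=8, hidden_dim=64, num_classes=3)
logits = model(torch.randn(16, 30, 8))                      # shape: (16, 3)
```

Replacing the attention pooling with the final hidden state, or stacking a convolutional front end before the LSTM, yields the CNN-LSTM and attention variants mentioned above.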
Papers
Forecasting Smog Clouds With Deep Learning
Valentijn Oldenburg, Juan Cardenas-Cartagena, Matias Valdenegro-Toro
Parameter Estimation of Long Memory Stochastic Processes with Deep Neural Networks
Bálint Csanády, Lóránt Nagy, Dániel Boros, Iván Ivkovic, Dávid Kovács, Dalma Tóth-Lakits, László Márkus, András Lukács
Optimizing News Text Classification with Bi-LSTM and Attention Mechanism for Efficient Data Processing
Bingyao Liu, Jiajing Chen, Rui Wang, Junming Huang, Yuanshuai Luo, Jianjun Wei
KARMA: Augmenting Embodied AI Agents with Long-and-short Term Memory Systems
Zixuan Wang, Bo Yu, Junzhe Zhao, Wenhao Sun, Sai Hou, Shuai Liang, Xing Hu, Yinhe Han, Yiming Gan
Achieving Predictive Precision: Leveraging LSTM and Pseudo Labeling for Volvo's Discovery Challenge at ECML-PKDD 2024
Carlo Metta, Marco Gregnanin, Andrea Papini, Silvia Giulia Galfrè, Andrea Fois, Francesco Morandin, Marco Fantozzi, Maurizio Parton
ConvLSTMTransNet: A Hybrid Deep Learning Approach for Internet Traffic Telemetry
Sajal Saha, Saikat Das, Glaucio H.S. Carvalho