Long Short-Term Memory
Long Short-Term Memory (LSTM) networks are a type of recurrent neural network designed to learn long-term dependencies in sequential data, using gated memory cells to decide what information to retain or discard across time steps. Current research focuses on enhancing LSTM architectures, for example by incorporating convolutional layers or attention mechanisms, and on hybrid models that combine LSTMs with other deep learning techniques such as transformers or graph neural networks, to improve efficiency and accuracy. This work matters because LSTMs remain effective across diverse fields, from financial forecasting and environmental monitoring to medical image analysis and activity recognition, offering powerful tools for analyzing complex temporal data.
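To make the gating idea concrete, here is a minimal sketch of a single LSTM time step in pure Python. It is a toy illustration, not any of the listed papers' implementations: the hidden size is 1, the weights `W`, `U`, `b` are arbitrary illustrative values, and real systems would use a vectorized library implementation.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step with scalar input and hidden size 1.

    W, U, b each hold one parameter per gate: input (i), forget (f),
    output (o), and candidate (g).
    """
    i = sigmoid(W["i"] * x + U["i"] * h_prev + b["i"])    # input gate: admit new info
    f = sigmoid(W["f"] * x + U["f"] * h_prev + b["f"])    # forget gate: keep old memory
    o = sigmoid(W["o"] * x + U["o"] * h_prev + b["o"])    # output gate: expose memory
    g = math.tanh(W["g"] * x + U["g"] * h_prev + b["g"])  # candidate cell update
    c = f * c_prev + i * g       # cell state carries long-term memory
    h = o * math.tanh(c)         # hidden state is this step's output
    return h, c

# Run a toy sequence through the cell with fixed illustrative weights.
W = {k: 0.5 for k in "ifog"}
U = {k: 0.5 for k in "ifog"}
b = {k: 0.0 for k in "ifog"}
h, c = 0.0, 0.0
for x in [1.0, -1.0, 0.5]:
    h, c = lstm_step(x, h, c, W, U, b)
print(h)
```

The additive cell-state update `c = f * c_prev + i * g` is what lets gradients flow across many time steps, which is the property the summary above refers to as learning long-term dependencies.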
Papers
Long Short-Term Memory to predict 3D Amino acids Positions in GPCR Molecular Dynamics
Juan Manuel López-Correa, Caroline König, Alfredo Vellido
A Deep Learning Network for the Classification of Intracardiac Electrograms in Atrial Tachycardia
Zerui Chen, Sonia Xhyn Teo, Andrie Ochtman, Shier Nee Saw, Nicholas Cheng, Eric Tien Siang Lim, Murphy Lyu, Hwee Kuan Lee
Deep fusion of gray level co-occurrence matrices for lung nodule classification
Ahmed Saihood, Hossein Karshenas, AhmadReza Naghsh Nilchi
Real-time Forecasting of Time Series in Financial Markets Using Sequentially Trained Many-to-one LSTMs
Kelum Gajamannage, Yonggi Park
An Edge-Cloud Integrated Framework for Flexible and Dynamic Stream Analytics
Xin Wang, Azim Khan, Jianwu Wang, Aryya Gangopadhyay, Carl E. Busart, Jade Freeman