Long Short-Term Memory
Long Short-Term Memory (LSTM) networks are recurrent neural networks that learn long-term dependencies in sequential data through gated memory cells, which regulate what information is retained, updated, or discarded at each timestep. Current research focuses on enhancing LSTM architectures, for example by incorporating convolutional layers or attention mechanisms, or by building hybrid models that combine LSTMs with other deep learning techniques such as transformers or graph neural networks, to improve efficiency and accuracy. This work is significant because LSTMs remain highly effective across diverse fields, from financial forecasting and environmental monitoring to medical image analysis and activity recognition, offering powerful tools for analyzing complex temporal data.
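To make the basic building block concrete, below is a minimal sketch of an LSTM sequence classifier in PyTorch. The class name, dimensions, and linear classification head are illustrative assumptions, not taken from any of the listed papers; the sketch only shows the standard pattern of encoding a sequence with `nn.LSTM` and predicting from the final hidden state.

```python
import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    """Minimal LSTM sequence classifier (illustrative sketch):
    encodes a sequence with an LSTM and classifies from the
    final hidden state."""
    def __init__(self, input_size: int, hidden_size: int, num_classes: int):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, input_size)
        _, (h_n, _) = self.lstm(x)   # h_n: (num_layers, batch, hidden_size)
        return self.head(h_n[-1])    # logits: (batch, num_classes)

# Hypothetical usage: 16 sequences of 50 timesteps with 8 features each.
model = LSTMClassifier(input_size=8, hidden_size=64, num_classes=3)
logits = model(torch.randn(16, 50, 8))
print(logits.shape)  # torch.Size([16, 3])
```

Summarizing a sequence by its final hidden state is a common baseline; the attention-based and hybrid variants mentioned above typically replace or augment this summarization step.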
Papers
Improving age prediction: Utilizing LSTM-based dynamic forecasting for data augmentation in multivariate time series analysis
Yutong Gao, Charles A. Ellis, Vince D. Calhoun, Robyn L. Miller
Transformer Attractors for Robust and Efficient End-to-End Neural Diarization
Lahiru Samarakoon, Samuel J. Broughton, Marc Härkönen, Ivan Fung
Digital Twin Technology Enabled Proactive Safety Application for Vulnerable Road Users: A Real-World Case Study
Erik Rua, Kazi Hasan Shakib, Sagar Dasgupta, Mizanur Rahman, Steven Jones
StableSSM: Alleviating the Curse of Memory in State-space Models through Stable Reparameterization
Shida Wang, Qianxiao Li