Long Short-Term Memory
Long Short-Term Memory (LSTM) networks are a type of recurrent neural network designed to process sequential data by learning long-term dependencies: gated memory cells let the network decide what to retain or discard across many time steps. Current research focuses on enhancing LSTM architectures, for example by incorporating convolutional layers and attention mechanisms, or by building hybrid models that combine LSTMs with other deep learning techniques such as transformers or graph neural networks, to improve efficiency and accuracy. This work is significant because LSTMs remain highly effective across diverse fields, from financial forecasting and environmental monitoring to medical image analysis and activity recognition, offering powerful tools for analyzing complex temporal data.
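To make the basic pattern concrete, here is a minimal sketch of an LSTM sequence classifier in PyTorch. It is illustrative only, not tied to any of the papers below; the class name SequenceClassifier and all dimensions are placeholder choices.

```python
import torch
import torch.nn as nn

class SequenceClassifier(nn.Module):
    """Minimal LSTM classifier: run a sequence through an LSTM and
    classify from the final hidden state."""

    def __init__(self, input_size: int, hidden_size: int, num_classes: int):
        super().__init__()
        # batch_first=True -> inputs are shaped (batch, seq_len, input_size)
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # h_n holds the last hidden state per layer: (num_layers, batch, hidden_size)
        _, (h_n, _) = self.lstm(x)
        return self.head(h_n[-1])

# Toy usage: 8 sequences of 20 time steps, 16 features per step.
model = SequenceClassifier(input_size=16, hidden_size=32, num_classes=4)
logits = model(torch.randn(8, 20, 16))
print(logits.shape)  # torch.Size([8, 4])
```

Classifying from the final hidden state suits sequence-level labels (e.g., emotion of a song); per-step tasks would instead use the full LSTM output at every time step.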
Papers
Music Emotion Prediction Using Recurrent Neural Networks
Xinyu Chang, Xiangyu Zhang, Haoruo Zhang, Yulu Ran
A Global Data-Driven Model for The Hippocampus and Nucleus Accumbens of Rat From The Local Field Potential Recordings (LFP)
Maedeh Sadeghi, Mahdi Aliyari Shoorehdeli, Shole Jamali, Abbas Haghparast
Physics-based deep learning reveals rising heating demand heightens air pollution in Norwegian cities
Cong Cao, Ramit Debnath, R. Michael Alvarez
xLSTM: Extended Long Short-Term Memory
Maximilian Beck, Korbinian Pöppel, Markus Spanring, Andreas Auer, Oleksandra Prudnikova, Michael Kopp, Günter Klambauer, Johannes Brandstetter, Sepp Hochreiter