Long Short-Term Memory
Long Short-Term Memory (LSTM) networks are a type of recurrent neural network designed to process sequential data by learning long-term dependencies, enabling accurate predictions and classifications in various applications. Current research focuses on enhancing LSTM architectures, such as incorporating convolutional layers, attention mechanisms, and hybrid models combining LSTMs with other deep learning techniques like transformers or graph neural networks, to improve efficiency and accuracy. This work is significant because LSTMs are proving highly effective across diverse fields, from financial forecasting and environmental monitoring to medical image analysis and activity recognition, offering powerful tools for analyzing complex temporal data.
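The long-term-dependency learning described above comes from the LSTM cell's gating mechanism: a forget gate decides what to discard from the cell state, an input gate decides what to add, and an output gate controls what is exposed as the hidden state. A minimal single-unit sketch in plain Python (the weight names and scalar formulation are illustrative, not from any paper listed here; real implementations use vectorized libraries):

```python
import math

def sigmoid(x):
    """Logistic function, squashes any real value into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

class LSTMCell:
    """Minimal scalar LSTM cell illustrating the standard gate equations.

    `w` is a dict of illustrative scalar weights/biases: input weight (w*),
    recurrent weight (u*), and bias (b*) for the forget (f), input (i),
    output (o) gates and the candidate state (g).
    """
    def __init__(self, w):
        self.w = w

    def step(self, x, h, c):
        w = self.w
        f = sigmoid(w["wf"] * x + w["uf"] * h + w["bf"])    # forget gate
        i = sigmoid(w["wi"] * x + w["ui"] * h + w["bi"])    # input gate
        o = sigmoid(w["wo"] * x + w["uo"] * h + w["bo"])    # output gate
        g = math.tanh(w["wg"] * x + w["ug"] * h + w["bg"])  # candidate state
        c_new = f * c + i * g           # keep part of old memory, add new
        h_new = o * math.tanh(c_new)    # expose a gated view of the memory
        return h_new, c_new

# Demo: run a short sequence through the cell with arbitrary weights.
weights = {k: 0.5 for k in
           ["wf", "uf", "bf", "wi", "ui", "bi",
            "wo", "uo", "bo", "wg", "ug", "bg"]}
cell = LSTMCell(weights)
h, c = 0.0, 0.0
for x in [1.0, -0.5, 0.25]:
    h, c = cell.step(x, h, c)
```

Because the cell state `c` is updated additively (gated by `f` and `i`) rather than repeatedly squashed, gradients can flow across many timesteps, which is what lets LSTMs capture long-range structure in the sequential data discussed above.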
Papers
Using Language Model to Bootstrap Human Activity Recognition Ambient Sensors Based in Smart Homes
Damien Bouchabou, Sao Mai Nguyen, Christophe Lohr, Benoit Leduc, Ioannis Kanellos
Appliance Level Short-term Load Forecasting via Recurrent Neural Network
Yuqi Zhou, Arun Sukumaran Nair, David Ganger, Abhinandan Tripathi, Chaitanya Baone, Hao Zhu
pmSensing: A Participatory Sensing Network for Predictive Monitoring of Particulate Matter
Lucas L. S. Sachetti, Enzo B. Cussuol, José Marcos S. Nogueira, Vinicius F. S. Mota
Time Series Prediction about Air Quality using LSTM-Based Models: A Systematic Mapping
Lucas L. S. Sachetti, Vinicius F. S. Mota
Subject-Independent Drowsiness Recognition from Single-Channel EEG with an Interpretable CNN-LSTM model
Jian Cui, Zirui Lan, Tianhu Zheng, Yisi Liu, Olga Sourina, Lipo Wang, Wolfgang Müller-Wittig
Design of an Novel Spectrum Sensing Scheme Based on Long Short-Term Memory and Experimental Validation
Nupur Choudhury, Kandarpa Kumar Sarma, Chinmoy Kalita, Aradhana Misra