Long Short-Term Memory
Long Short-Term Memory (LSTM) networks are a type of recurrent neural network designed to process sequential data by learning long-term dependencies, enabling accurate predictions and classifications in various applications. Current research focuses on enhancing LSTM architectures, such as incorporating convolutional layers, attention mechanisms, and hybrid models combining LSTMs with other deep learning techniques like transformers or graph neural networks, to improve efficiency and accuracy. This work is significant because LSTMs are proving highly effective across diverse fields, from financial forecasting and environmental monitoring to medical image analysis and activity recognition, offering powerful tools for analyzing complex temporal data.
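As a concrete illustration of how an LSTM learns long-term dependencies, a single cell step can be sketched with the standard gating equations. This is a minimal NumPy sketch, not code from any of the papers below; all names (`W`, `U`, `b`, `lstm_step`) are illustrative:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step.
    W: (4H, D) input weights, U: (4H, H) recurrent weights, b: (4H,) bias.
    Gate order within the stacked matrices: input, forget, output, candidate."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])        # input gate: how much new information to admit
    f = sigmoid(z[H:2*H])      # forget gate: how much of the old cell state to keep
    o = sigmoid(z[2*H:3*H])    # output gate: how much of the cell state to expose
    g = np.tanh(z[3*H:4*H])    # candidate cell state
    c = f * c_prev + i * g     # cell state carries information across many steps
    h = o * np.tanh(c)         # hidden state passed to the next layer / time step
    return h, c

# Usage: run a short random sequence through one cell.
rng = np.random.default_rng(0)
D, H = 3, 5                    # input size and hidden size (illustrative)
W = rng.standard_normal((4 * H, D)) * 0.1
U = rng.standard_normal((4 * H, H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.standard_normal((10, D)):
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)  # (5,)
```

The additive update `c = f * c_prev + i * g` is what lets gradients flow across long sequences; the hybrid models surveyed below keep this cell intact and combine it with convolutional, attention, or graph components.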
Papers
Automatic speech recognition for the Nepali language using CNN, bidirectional LSTM and ResNet
Manish Dhakal, Arman Chhetri, Aman Kumar Gupta, Prabin Lamichhane, Suraj Pandey, Subarna Shakya
Research on Education Big Data for Students Academic Performance Analysis based on Machine Learning
Chun Wang, Jiexiao Chen, Ziyang Xie, Jianke Zou
Enhancing IoT Security with CNN and LSTM-Based Intrusion Detection Systems
Afrah Gueriani, Hamza Kheddar, Ahmed Cherif Mazari
Modeling Long Sequences in Bladder Cancer Recurrence: A Comparative Evaluation of LSTM, Transformer, and Mamba
Runquan Zhang, Jiawen Jiang, Xiaoping Shi
Exploring Sectoral Profitability in the Indian Stock Market Using Deep Learning
Jaydip Sen, Hetvi Waghela, Sneha Rakshit