Backpropagation Through Time
Backpropagation Through Time (BPTT) is the standard algorithm for training recurrent neural networks (RNNs) on sequential data such as time series, and it is also widely used to train spiking neural networks (SNNs). BPTT unrolls the network across the sequence and applies backpropagation to the unrolled graph, so its computational and memory costs grow with sequence length. Current research focuses on mitigating these limitations, especially for long sequences, through online learning algorithms (e.g., FPTT, SOLSA) and modifications such as temporal truncation and local training. These efforts aim to improve the efficiency and scalability of training RNNs and SNNs, enabling their application to more complex tasks and larger datasets in fields ranging from fluid dynamics prediction to energy-efficient AI.
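The truncation idea can be sketched with a scalar RNN: full BPTT carries the hidden-state gradient back through every step, while truncated BPTT detaches it at segment boundaries, trading gradient fidelity for bounded cost. This is a minimal illustrative sketch; the RNN, data, and window size `k` are invented for illustration and do not come from any of the methods named above:

```python
import numpy as np

def forward(w, u, x, h0=0.0):
    """Scalar RNN: h_t = tanh(w*h_{t-1} + u*x_t). Returns all hidden states."""
    hs, h = [], h0
    for xt in x:
        h = np.tanh(w * h + u * xt)
        hs.append(h)
    return hs

def loss(w, u, x, y, h0=0.0):
    """Squared-error loss summed over the sequence."""
    hs = forward(w, u, x, h0)
    return 0.5 * sum((h - yt) ** 2 for h, yt in zip(hs, y))

def bptt(w, u, x, y, h0=0.0, k=None):
    """Gradients of the loss w.r.t. w and u via BPTT.

    If k is set, the carried hidden-state gradient is zeroed every k
    steps (truncated BPTT): gradients stop flowing across segment
    boundaries, bounding memory/compute per update.
    """
    hs = forward(w, u, x, h0)
    dw = du = grad_h = 0.0
    for t in reversed(range(len(x))):
        if k is not None and (t + 1) % k == 0:
            grad_h = 0.0                      # segment boundary: detach carry
        grad_h += hs[t] - y[t]                # local loss gradient at step t
        grad_a = grad_h * (1 - hs[t] ** 2)    # back through tanh
        h_prev = hs[t - 1] if t > 0 else h0
        dw += grad_a * h_prev                 # accumulate recurrent-weight grad
        du += grad_a * x[t]                   # accumulate input-weight grad
        grad_h = grad_a * w                   # carry gradient to step t-1
    return dw, du
```

With `k=None` the returned gradient matches a finite-difference check of the loss; with a small `k` it deviates, since long-range temporal dependencies are dropped — the trade-off that truncation-based and online methods try to manage.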