Backpropagation Through Time

Backpropagation Through Time (BPTT) is the standard algorithm for training recurrent neural networks (RNNs) on sequential data such as time series, and it also underlies gradient-based training of spiking neural networks (SNNs), whose dynamics unfold over time. Current research focuses on mitigating BPTT's main limitations, its computational cost and memory footprint, both of which grow with sequence length, through online learning algorithms (e.g., FPTT, SOLSA) and modifications such as temporal truncation and local training. These efforts aim to make training RNNs and SNNs more efficient and scalable, enabling their application to more complex tasks and larger datasets in areas ranging from fluid dynamics prediction to energy-efficient AI.
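As a minimal sketch of what BPTT and its truncated variant compute, the NumPy code below unrolls a vanilla tanh RNN and backpropagates a toy loss through time; the function names, the loss (0.5·||h_T||²), and the `truncate` parameter are illustrative choices, not drawn from any particular paper surveyed here.

```python
import numpy as np

def rnn_forward(xs, h0, Wxh, Whh):
    """Run a vanilla tanh RNN, returning all hidden states [h0, h1, ..., hT]."""
    hs = [h0]
    for x in xs:
        hs.append(np.tanh(Wxh @ x + Whh @ hs[-1]))
    return hs

def bptt_grad_Whh(xs, hs, Whh, truncate=None):
    """Gradient of the toy loss L = 0.5*||hT||^2 w.r.t. Whh via BPTT.

    With truncate=k, the backward pass only unrolls through the last k
    steps -- the usual truncated-BPTT approximation that trades gradient
    fidelity for a cost independent of the full sequence length.
    """
    T = len(xs)
    start = 0 if truncate is None else max(0, T - truncate)
    dh = hs[-1].copy()                      # dL/dhT for this toy loss
    dWhh = np.zeros_like(Whh)
    for t in range(T - 1, start - 1, -1):
        da = dh * (1.0 - hs[t + 1] ** 2)    # backprop through tanh
        dWhh += np.outer(da, hs[t])         # Whh is shared across time steps
        dh = Whh.T @ da                     # carry the gradient back to h_t
    return dWhh
```

Checking the full backward pass (`truncate=None`) against a finite-difference estimate confirms the gradient; calling with a small `truncate` yields a cheaper but biased gradient, which is exactly the trade-off the truncation-based methods above try to manage.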

Papers