Paper ID: 2309.14523

Smooth Exact Gradient Descent Learning in Spiking Neural Networks

Christian Klos, Raoul-Martin Memmesheimer

Artificial neural networks are trained with great success using backpropagation. For spiking neural networks, however, a similar gradient descent scheme seems prohibitive because spikes appear and disappear suddenly and disruptively during learning. Here, we demonstrate exact gradient descent learning based on spiking dynamics that change only continuously. These dynamics are generated by neuron models whose spikes vanish and appear at the end of a trial, where they no longer influence other neurons. This also enables gradient-based addition and removal of spikes. We apply our learning scheme to induce spikes and to move them continuously to desired times, in single neurons and in recurrent networks. Furthermore, it achieves competitive performance on a benchmark task using deep, initially silent networks. Our results show how non-disruptive learning is possible despite discrete spikes.
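The following is a minimal toy sketch, not the paper's actual neuron model or learning rule, of the core idea stated in the abstract: if a neuron's spike time is a smooth function of its parameters and a spike can only appear or vanish by sliding past the end of the trial, then exact gradient descent on spike times never encounters a disruptive jump. All names and parameters (tau, theta, w, t_target, T_trial, the learning rate, and the "nudge" rule for silent trials) are hypothetical illustration choices.

```python
# Toy neuron: constant input of strength w gives V(t) = w * (1 - exp(-t / tau)),
# and a spike is emitted when V first reaches the threshold theta. The spike time
# t_s(w) = -tau * log(1 - theta / w) is smooth in w and diverges as w decreases
# toward theta, so the spike leaves the trial through its end instead of vanishing
# abruptly. Gradient descent on w can therefore move the spike continuously.
import math

tau, theta = 10.0, 1.0           # membrane time constant, spike threshold (toy values)
T_trial, t_target = 50.0, 20.0   # trial length and desired spike time (toy values)
w, lr = 1.05, 1e-4               # initial input strength, learning rate (toy values)

def spike_time(w):
    """Smooth spike time; returns None if no spike occurs within the trial."""
    if w <= theta:
        return None
    t_s = -tau * math.log(1.0 - theta / w)
    return t_s if t_s <= T_trial else None

for step in range(2000):
    t_s = spike_time(w)
    if t_s is None:
        # No spike within the trial: nudge w upward so a spike enters smoothly
        # through the trial end (toy heuristic, not the paper's rule).
        w += lr
        continue
    # Quadratic loss on the spike time and its exact gradient with respect to w.
    dL_dts = 2.0 * (t_s - t_target)
    dts_dw = -tau * theta / (w * w - w * theta)
    w -= lr * dL_dts * dts_dw

print(f"learned w = {w:.4f}, spike time = {spike_time(w):.2f} (target {t_target})")
```

In this sketch the loss is a continuous function of w even across the transition between "spike present" and "spike absent", which is the qualitative property the abstract attributes to the proposed neuron models; the paper's construction and tasks are, of course, more involved.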

Submitted: Sep 25, 2023