Synaptic Plasticity

Synaptic plasticity, the ability of synapses to strengthen or weaken over time, is a fundamental mechanism underlying learning and memory in biological neural networks. Current research focuses on incorporating biologically inspired plasticity rules into artificial neural networks (ANNs), particularly spiking neural networks (SNNs), to improve their learning efficiency, robustness, and capacity for continual learning. This work explores model architectures and algorithms based on spike-timing-dependent plasticity (STDP), neuromodulation, and competitive plasticity, with the goal of improving both energy efficiency and learning performance. Understanding and replicating synaptic plasticity in artificial systems holds significant promise for advancing artificial intelligence and for providing deeper insight into the biological basis of learning and cognition.
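
As a concrete illustration of one such rule, the sketch below implements the classic pair-based STDP window, in which the sign and magnitude of a weight change depend on the relative timing of pre- and post-synaptic spikes. The amplitudes and time constants are illustrative assumptions, not values drawn from any particular paper listed here.

```python
import numpy as np

# Minimal pair-based STDP sketch (illustrative; parameter names and values
# are assumptions, not taken from any specific paper below).
A_PLUS = 0.01      # potentiation amplitude
A_MINUS = 0.012    # depression amplitude
TAU_PLUS = 20.0    # potentiation time constant (ms)
TAU_MINUS = 20.0   # depression time constant (ms)

def stdp_weight_change(delta_t):
    """Weight update for a single pre/post spike pair.

    delta_t = t_post - t_pre (ms). Positive delta_t (pre fires before post)
    yields potentiation; negative delta_t yields depression.
    """
    if delta_t >= 0:
        return A_PLUS * np.exp(-delta_t / TAU_PLUS)
    return -A_MINUS * np.exp(delta_t / TAU_MINUS)

# Example: a pre-spike 5 ms before a post-spike strengthens the synapse,
# while the reverse ordering weakens it.
print(stdp_weight_change(+5.0))   # > 0, potentiation
print(stdp_weight_change(-5.0))   # < 0, depression
```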

Papers