Hebbian Model

The Hebbian model posits that learning occurs through the strengthening of connections between neurons with correlated activity ("cells that fire together wire together"), a principle refined by plasticity rules such as Spike-Timing-Dependent Plasticity (STDP), where the sign and magnitude of the weight change depend on the relative timing of pre- and postsynaptic spikes. Current research applies Hebbian learning to artificial neural networks, particularly recurrent neural networks (RNNs) and spiking neural networks (SNNs), to achieve efficient continual learning and to mitigate catastrophic forgetting. This work matters both for developing more biologically plausible, energy-efficient AI systems and for providing insight into the mechanisms of learning and memory in biological brains.
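
The sketch below is an illustrative toy example, not the method of any particular paper listed here. It assumes a rate-based Hebbian update (weight change proportional to the product of pre- and postsynaptic activity), Oja's variant that bounds weight growth, and a standard pair-based exponential STDP kernel; all function names and parameter values are hypothetical choices for the demo.

```python
# Toy Hebbian / Oja / STDP updates (illustrative sketch only).
import numpy as np

rng = np.random.default_rng(0)

n_pre, n_post = 8, 4   # layer sizes (arbitrary for the demo)
eta = 0.01             # learning rate
W = rng.normal(scale=0.1, size=(n_post, n_pre))  # post x pre weight matrix

def hebbian_step(W, pre, eta=0.01):
    """Plain Hebbian rule: dW_ij = eta * post_i * pre_j (co-activity strengthens weights)."""
    post = W @ pre
    return W + eta * np.outer(post, pre)

def oja_step(W, pre, eta=0.01):
    """Oja's rule: Hebbian growth plus a decay term that keeps weight norms bounded."""
    post = W @ pre
    return W + eta * (np.outer(post, pre) - (post ** 2)[:, None] * W)

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP kernel: dt = t_post - t_pre (ms).
    Pre-before-post (dt > 0) potentiates; post-before-pre depresses."""
    return a_plus * np.exp(-dt / tau) if dt > 0 else -a_minus * np.exp(dt / tau)

# Drive the weights with random input patterns; under Oja's rule the
# row norms of W stay bounded instead of growing without limit.
for _ in range(1000):
    x = rng.normal(size=n_pre)
    W = oja_step(W, x, eta)

print("weight row norms:", np.linalg.norm(W, axis=1))
print("STDP example, pre 5 ms before post:", stdp_dw(5.0))
```

Because these updates use only locally available pre- and postsynaptic signals, they avoid the global error transport of backpropagation, which is the property the continual-learning and SNN work above builds on.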

Papers