Synaptic Change
Synaptic change, the modification of connections between neurons, is a central focus in neuroscience and artificial intelligence, aimed at understanding how learning and memory are encoded and implemented. Current research emphasizes more biologically plausible artificial neural networks, particularly spiking neural networks (SNNs), by learning the plasticity rules themselves rather than only the synaptic weights, and by pursuing energy-efficient architectures through techniques such as synaptic pruning and interspike interval modulation. These advances promise to improve the efficiency and adaptability of AI systems and to deepen our understanding of biological learning mechanisms in the brain.
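To make the distinction between learning weights and learning plasticity rules concrete, the sketch below shows a toy setup in which weights evolve only through a local, generalized Hebbian rule whose coefficients (eta, A, B, C, D) are the quantities one would optimize across tasks, followed by simple magnitude-based synaptic pruning. All names, coefficient values, and functions here are illustrative assumptions, not taken from any specific paper in this area.

```python
import numpy as np

rng = np.random.default_rng(0)

# Meta-parameters of a generalized Hebbian rule:
#   delta_w = eta * (A * pre * post + B * pre + C * post + D)
# In "learning the plasticity rule" setups, these coefficients, not the
# weights themselves, are what gets optimized across tasks (values below
# are arbitrary placeholders).
plasticity = {"eta": 0.01, "A": 1.0, "B": 0.0, "C": 0.0, "D": 0.0}

def apply_plasticity(w, pre, post, p):
    """One local plasticity update; pre/post are activity vectors."""
    hebb = np.outer(post, pre)                      # co-activity term
    dw = p["eta"] * (p["A"] * hebb
                     + p["B"] * pre[None, :]
                     + p["C"] * post[:, None]
                     + p["D"])
    return w + dw

def prune(w, keep_fraction=0.5):
    """Magnitude-based synaptic pruning: zero out the weakest synapses."""
    threshold = np.quantile(np.abs(w), 1.0 - keep_fraction)
    return np.where(np.abs(w) >= threshold, w, 0.0)

# Toy inner loop: the weights change only through the plasticity rule,
# never through direct gradient updates.
w = rng.normal(scale=0.1, size=(4, 8))
for _ in range(100):
    pre = rng.random(8)
    post = np.tanh(w @ pre)
    w = apply_plasticity(w, pre, post, plasticity)

w = prune(w, keep_fraction=0.5)
print("surviving synapses:", int(np.count_nonzero(w)), "of", w.size)
```

In a full meta-learning pipeline, an outer loop would adjust the plasticity coefficients based on how well the inner, locally updated network performs; the pruning step illustrates one common route to the energy-efficient architectures mentioned above.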