Spiking Neural Network
Spiking neural networks (SNNs), inspired by the brain's event-driven communication, aim to create energy-efficient artificial intelligence by processing information through binary spikes rather than continuous values. Current research emphasizes improving training efficiency through novel neuron models (e.g., parallel resonate and fire neurons, multi-compartment neurons), developing specialized weight initialization methods, and exploring various coding schemes (e.g., Poisson coding, stepwise weighted spike coding) to optimize performance and reduce energy consumption. This field is significant due to SNNs' potential for low-power applications in embedded systems, neuromorphic computing, and real-time signal processing tasks like robotic manipulation and brain-computer interfaces.
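To make the idea of binary-spike processing concrete, here is a minimal sketch (not taken from any of the papers below) of Poisson rate coding feeding a single leaky integrate-and-fire (LIF) neuron: an analog intensity is converted into a 0/1 spike train, and the neuron integrates those spikes into a decaying membrane potential, emitting its own binary spike whenever the threshold is crossed. All names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Poisson rate coding (illustrative): turn an analog intensity in [0, 1]
# into a binary spike train via one Bernoulli draw per timestep.
def poisson_encode(intensity, n_steps, rng):
    return (rng.random(n_steps) < intensity).astype(np.int8)

# LIF neuron (illustrative parameters): the membrane potential decays each
# step, accumulates weighted input spikes, and emits a binary spike
# (then hard-resets) whenever it crosses the threshold.
def lif_neuron(spikes_in, weight=0.5, decay=0.9, threshold=1.0):
    v = 0.0
    spikes_out = []
    for s in spikes_in:
        v = decay * v + weight * s
        if v >= threshold:
            spikes_out.append(1)
            v = 0.0  # reset after firing
        else:
            spikes_out.append(0)
    return spikes_out

spike_train = poisson_encode(0.8, n_steps=100, rng=rng)
out = lif_neuron(spike_train)
print("input spikes:", int(spike_train.sum()), "output spikes:", sum(out))
```

Note the event-driven character: with no input spikes the potential only decays and no multiply-accumulate work is triggered, which is the source of the energy savings the summary refers to.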
Papers
Towards Chip-in-the-loop Spiking Neural Network Training via Metropolis-Hastings Sampling
Ali Safa, Vikrant Jaltare, Samira Sebt, Kameron Gano, Johannes Leugering, Georges Gielen, Gert Cauwenberghs
Fine-Tuning Surrogate Gradient Learning for Optimal Hardware Performance in Spiking Neural Networks
Ilkin Aliyev, Tosiron Adegbija
Benchmarking Spiking Neural Network Learning Methods with Varying Locality
Jiaqi Lin, Sen Lu, Malyaban Bal, Abhronil Sengupta
Parallel Spiking Unit for Efficient Training of Spiking Neural Networks
Yang Li, Yinqian Sun, Xiang He, Yiting Dong, Dongcheng Zhao, Yi Zeng
LM-HT SNN: Enhancing the Performance of SNN to ANN Counterpart through Learnable Multi-hierarchical Threshold Model
Zecheng Hao, Xinyu Shi, Yujia Liu, Zhaofei Yu, Tiejun Huang