Spiking Neural Network
Spiking neural networks (SNNs), inspired by the brain's event-driven communication, aim to enable energy-efficient artificial intelligence by processing information through binary spikes rather than continuous values. Current research emphasizes improving training efficiency through novel neuron models (e.g., parallel resonate-and-fire neurons, multi-compartment neurons), developing specialized weight initialization methods, and exploring various coding schemes (e.g., Poisson coding, stepwise weighted spike coding) to optimize performance and reduce energy consumption. The field is significant because of SNNs' potential for low-power deployment in embedded systems, neuromorphic hardware, and real-time applications such as robotic manipulation and brain-computer interfaces. A minimal sketch of this spike-based processing follows below.
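To make the spike-based processing described above concrete, the following is a minimal sketch of a single leaky integrate-and-fire (LIF) neuron driven by Poisson-coded input, written in plain NumPy. The time constant, threshold, weight, and simulation length are illustrative assumptions and do not come from any of the papers listed below.

# Minimal sketch: Poisson rate coding feeding one leaky integrate-and-fire (LIF)
# neuron that emits binary spikes. All constants are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

T = 100                      # number of discrete time steps
pixel_intensity = 0.7        # normalized input value in [0, 1]

# Poisson coding: at each step the input spikes with probability equal to its intensity.
input_spikes = (rng.random(T) < pixel_intensity).astype(np.float32)

# LIF neuron parameters (assumed values for illustration).
tau = 0.9                    # membrane leak factor per step
v_threshold = 1.0            # firing threshold
weight = 0.5                 # synaptic weight

v = 0.0
output_spikes = np.zeros(T, dtype=np.float32)
for t in range(T):
    v = tau * v + weight * input_spikes[t]   # leak, then integrate the incoming spike
    if v >= v_threshold:                     # fire a binary spike and reset the membrane
        output_spikes[t] = 1.0
        v = 0.0

print(f"input rate:  {input_spikes.mean():.2f} spikes/step")
print(f"output rate: {output_spikes.mean():.2f} spikes/step")

Because computation happens only when spikes arrive, activity (and thus energy use on neuromorphic hardware) scales with the input's firing rate rather than with a fixed dense matrix multiply, which is the efficiency argument behind SNNs.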
Papers
Uncovering the Representation of Spiking Neural Networks Trained with Surrogate Gradient
Yuhang Li, Youngeun Kim, Hyoungseob Park, Priyadarshini Panda
Parallel Spiking Neurons with High Efficiency and Ability to Learn Long-term Dependencies
Wei Fang, Zhaofei Yu, Zhaokun Zhou, Ding Chen, Yanqi Chen, Zhengyu Ma, Timothée Masquelier, Yonghong Tian