Neuromorphic Hardware
Neuromorphic hardware aims to build computing systems inspired by the brain's architecture, prioritizing energy efficiency and speed for tasks such as image and speech processing. Current research focuses on developing and training spiking neural networks (SNNs), often using biologically plausible learning rules such as spike-timing-dependent plasticity (STDP) and employing architectures such as transformers and recurrent networks, with a strong emphasis on efficient mapping to neuromorphic chips. The field is significant because it promises to overcome the limitations of traditional computing in power-constrained settings, enabling advances in areas such as edge AI, robotics, and brain-computer interfaces.
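To make the SNN-plus-STDP idea above concrete, here is a minimal sketch (an illustration, not taken from the papers listed below) of a leaky integrate-and-fire neuron driven by random input spikes, with a simple pair-based STDP rule updating its input weights. All parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

T = 200          # simulation steps
n_in = 5         # number of input synapses
dt = 1.0         # ms per step
tau_m = 20.0     # membrane time constant (ms)
v_thresh = 1.0   # spike threshold
w = rng.uniform(0.2, 0.5, n_in)   # initial synaptic weights

# STDP parameters (illustrative values)
a_plus, a_minus = 0.01, 0.012
tau_trace = 20.0

v = 0.0
pre_trace = np.zeros(n_in)  # decaying trace of presynaptic spikes
post_trace = 0.0            # decaying trace of postsynaptic spikes
out_spikes = 0

for t in range(T):
    pre = rng.random(n_in) < 0.05      # Poisson-like input spikes
    pre_trace *= np.exp(-dt / tau_trace)
    post_trace *= np.exp(-dt / tau_trace)
    pre_trace[pre] += 1.0

    # LIF membrane dynamics: leak plus weighted input current
    v += dt / tau_m * (-v) + w @ pre
    if v >= v_thresh:
        v = 0.0                        # reset after output spike
        out_spikes += 1
        post_trace += 1.0
        w += a_plus * pre_trace        # potentiate recently active inputs
    # depress inputs that fire just after a postsynaptic spike
    w -= a_minus * post_trace * pre
    w = np.clip(w, 0.0, 1.0)

print(out_spikes, w.round(3))
```

Event-driven updates like these are what neuromorphic chips accelerate: computation happens only on spikes, which is the source of the energy savings the paragraph describes.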
Papers
Efficient Sparse Coding with the Adaptive Locally Competitive Algorithm for Speech Classification
Soufiyan Bahadi, Eric Plourde, Jean Rouat
Training Spiking Neural Networks via Augmented Direct Feedback Alignment
Yongbo Zhang, Katsuma Inoue, Mitsumasa Nakajima, Toshikazu Hashimoto, Yasuo Kuniyoshi, Kohei Nakajima