SNN Architecture
Spiking Neural Networks (SNNs), inspired by how biological neurons communicate, aim to achieve energy-efficient machine learning by processing information as sparse, event-driven spike trains. Current research focuses on finding efficient SNN architectures through automated methods such as neural architecture search (NAS), tailored to hardware constraints (e.g., memory, latency) and specific applications (e.g., image recognition, NLP). These efforts are driven by the need for low-power computation in resource-limited environments such as embedded systems and IoT devices, and they promise significant advances in energy-efficient artificial intelligence.
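To make the notion of sparse, event-driven spike trains concrete, here is a minimal sketch of a single leaky integrate-and-fire (LIF) neuron, a common neuron model in SNN work. This is an illustrative assumption, not drawn from any specific paper listed on this page; the threshold, reset, and decay values are arbitrary.

```python
import numpy as np

def lif_neuron(input_current, v_thresh=1.0, v_reset=0.0, decay=0.9):
    """Simulate a single leaky integrate-and-fire (LIF) neuron.

    input_current: 1-D array of input values, one per discrete time step.
    Returns a binary spike train of the same length.
    """
    v = 0.0                              # membrane potential
    spikes = np.zeros_like(input_current)
    for t, i_t in enumerate(input_current):
        v = decay * v + i_t              # leaky integration of the input
        if v >= v_thresh:                # threshold crossing emits a spike
            spikes[t] = 1.0
            v = v_reset                  # reset the potential after spiking
    return spikes

# Example: a weak, noisy input drives only occasional spikes, so the
# output activity stays sparse, which is what enables low-power computation.
rng = np.random.default_rng(0)
current = rng.uniform(0.0, 0.3, size=100)
spike_train = lif_neuron(current)
print(f"{int(spike_train.sum())} spikes over {len(spike_train)} time steps")
```

Because most time steps produce no spike, downstream layers only need to compute when an event arrives, which is the source of the energy savings described above.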