Analog Accelerator
Analog accelerators aim to improve the energy efficiency and speed of deep neural network training and inference by replacing energy-intensive digital components with analog circuits. Current research focuses on novel algorithms (such as Equilibrium Propagation and variations of stochastic gradient descent) and hybrid architectures that combine analog and digital components to mitigate the limitations of purely analog approaches, notably limited precision and analog noise. This work is significant because it could lead to substantial reductions in the energy consumption and cost of AI, enabling broader deployment of powerful AI models in resource-constrained environments.
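To make the precision and noise issue concrete, here is a minimal illustrative sketch in Python (NumPy) of a hybrid analog-digital scheme of the kind described above: an analog crossbar matrix-vector product is emulated with quantized values and read noise, and a small digital correction compensates for part of the resulting error. The function names (`quantize`, `analog_matvec`, `hybrid_matvec`) and the chosen noise and precision parameters are assumptions made for illustration, not taken from any particular paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(x, bits=4):
    """Round values to a limited number of levels, mimicking DAC/ADC precision."""
    levels = 2 ** bits - 1
    lo, hi = x.min(), x.max()
    if hi == lo:
        return x
    scaled = (x - lo) / (hi - lo)              # map to [0, 1]
    return np.round(scaled * levels) / levels * (hi - lo) + lo

def analog_matvec(W, x, noise_std=0.02, bits=4):
    """Emulate an analog crossbar product: quantized weights/inputs plus read noise."""
    Wq = quantize(W, bits)
    xq = quantize(x, bits)
    y = Wq @ xq
    noise_scale = noise_std * np.abs(y).mean() + 1e-12
    return y + rng.normal(0.0, noise_scale, size=y.shape)

def hybrid_matvec(W, x, bits=4, noise_std=0.02):
    """Hybrid scheme: analog product plus a digital correction for the weight quantization error."""
    y_analog = analog_matvec(W, x, noise_std=noise_std, bits=bits)
    residual_W = W - quantize(W, bits)          # weight error stored and applied digitally
    return y_analog + residual_W @ x

if __name__ == "__main__":
    W = rng.standard_normal((64, 128))
    x = rng.standard_normal(128)
    exact = W @ x
    for name, y in [("analog", analog_matvec(W, x)), ("hybrid", hybrid_matvec(W, x))]:
        err = np.linalg.norm(y - exact) / np.linalg.norm(exact)
        print(f"{name:>6} relative error: {err:.3f}")
```

Running the script shows the hybrid variant recovering much of the accuracy lost to weight quantization, while input quantization and read noise remain, which is the trade-off hybrid analog-digital designs aim to manage.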