Paper ID: 2305.07376
DAISM: Digital Approximate In-SRAM Multiplier-based Accelerator for DNN Training and Inference
Lorenzo Sonnino, Shaswot Shresthamali, Yuan He, Masaaki Kondo
DNNs are widely used but face significant computational costs due to matrix multiplications, especially from data movement between memory and processing units. Processing-in-Memory (PIM) is therefore a promising approach, as it greatly reduces this overhead. However, most PIM solutions rely either on novel memory technologies that have yet to mature or on bit-serial computations, which incur significant performance overhead and scalability issues. Our work proposes an in-SRAM digital multiplier that uses conventional memory to perform bit-parallel computations by activating multiple wordlines. We then introduce DAISM, an architecture leveraging this multiplier, which achieves up to two orders of magnitude higher area efficiency than SOTA counterparts, with competitive energy efficiency.
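To make the bit-parallel versus bit-serial distinction in the abstract concrete, the sketch below models multiplication as the simultaneous generation and summation of all partial products, which is the arithmetic a bit-parallel in-SRAM scheme computes at once rather than one bit per cycle. This is a generic illustration under that assumption only; the function name, bit width, and structure are hypothetical and do not describe DAISM's actual multiplier circuit.

```python
# Conceptual model (hypothetical, not the paper's circuit): bit-parallel
# multiplication as simultaneous partial-product generation, analogous to
# activating several SRAM wordlines at once instead of iterating
# bit-serially over one operand.

def bit_parallel_multiply(a: int, b: int, bits: int = 8) -> int:
    """Multiply two unsigned `bits`-wide integers by summing all partial
    products, each formed as (bit i of b) AND (a shifted left by i)."""
    assert 0 <= a < (1 << bits) and 0 <= b < (1 << bits)
    # All partial products are available "at once" (bit-parallel); a
    # bit-serial scheme would produce and accumulate one per cycle.
    partial_products = [((b >> i) & 1) * (a << i) for i in range(bits)]
    return sum(partial_products)

assert bit_parallel_multiply(23, 45) == 23 * 45
```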
Submitted: May 12, 2023