Memory Technology
Research on memory technology focuses on improving the speed, energy efficiency, and capacity of memory systems, particularly for accelerating deep neural network (DNN) workloads. Current efforts explore novel memory architectures such as crossbar arrays built from emerging devices (e.g., ferroelectric transistors, resistive RAM) and in-memory computing approaches that perform computation directly within the memory array, often using techniques such as partial wordline activation to mitigate hardware limitations. These advances aim to overcome the von Neumann bottleneck, enabling faster and more energy-efficient processing of large datasets in fields such as artificial intelligence and data science.
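As a rough illustration of the in-memory computing idea mentioned above, the sketch below simulates an analog crossbar matrix-vector multiply in which only a small group of wordlines is activated per step and the resulting partial sums are accumulated digitally. This is a minimal model, not any particular paper's method: the function name, the group size k, the uniform ADC quantization, and the fixed full-scale range are all assumptions made for illustration.

```python
import numpy as np

# Hypothetical sketch of crossbar MVM with partial wordline activation.
# Weights are stored as conductances G (rows = wordlines/inputs,
# cols = bitlines/outputs). Activating only k wordlines per analog step
# limits the bitline current the readout must resolve (an assumed motive);
# partial results are accumulated digitally after quantization.

def crossbar_mvm_partial(G, v, k=16, adc_bits=8):
    """Approximate y = G.T @ v using groups of at most k active wordlines."""
    n_rows, n_cols = G.shape
    y = np.zeros(n_cols)
    # Fixed ADC full-scale range derived from the worst-case column current
    # (a simplifying assumption for this sketch).
    full_scale = np.abs(G).sum(axis=0).max() * np.abs(v).max() + 1e-12
    levels = 2 ** adc_bits
    for start in range(0, n_rows, k):
        rows = slice(start, start + k)
        # Analog partial sum: only the selected wordlines drive the bitlines.
        partial = G[rows, :].T @ v[rows]
        # Uniform ADC model for the bitline readout.
        q = np.round(partial / full_scale * (levels - 1)) / (levels - 1) * full_scale
        y += q  # digital accumulation of partial results
    return y

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    G = rng.uniform(0.0, 1.0, size=(128, 64))   # conductance matrix (weights)
    v = rng.uniform(0.0, 1.0, size=128)         # input voltages (activations)
    y_ref = G.T @ v
    y_est = crossbar_mvm_partial(G, v, k=16, adc_bits=8)
    print("max abs error vs. ideal MVM:", np.max(np.abs(y_est - y_ref)))
```

Smaller group sizes reduce the dynamic range each analog readout must cover at the cost of more accumulation steps; the trade-off between k and ADC resolution is exactly the kind of hardware limitation the summary alludes to.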