Memory Technology

Research in memory technology focuses intensely on improving the speed, energy efficiency, and capacity of memory systems, particularly for accelerating deep neural network (DNN) computations. Current efforts explore novel memory architectures such as crossbar arrays built from emerging devices (e.g., ferroelectric transistors, resistive RAM), as well as in-memory computing approaches that perform computations directly where the data is stored, often employing techniques like partial wordline activation to mitigate hardware limitations. These advances aim to overcome the von Neumann bottleneck, with significant impact on fields like artificial intelligence and data science by enabling faster and more energy-efficient processing of large datasets.
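
As a rough illustration (not drawn from any specific paper listed below), the sketch assumes the common crossbar model in which layer weights are stored as cell conductances and a matrix-vector product emerges from Ohm's law and Kirchhoff's current law on the bitlines. It also shows, under that assumption, how partial wordline activation splits the product into smaller partial sums that are accumulated digitally; the array size and the `rows_per_step` parameter are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# DNN layer weights mapped onto cell conductances (rows = wordlines, cols = bitlines).
G = rng.uniform(0.0, 1.0, size=(128, 64))   # conductance matrix (illustrative size)
v = rng.uniform(0.0, 1.0, size=128)         # input activations applied as wordline voltages

# Idealized in-memory computation: each bitline current is a dot product of
# the applied voltages with one conductance column, i.e. one MVM in a single analog step.
i_full = v @ G

# Partial wordline activation: drive only `rows_per_step` wordlines per step so the
# analog bitline currents stay small, then accumulate the partial sums digitally.
rows_per_step = 16                           # assumed value, purely for illustration
i_partial = np.zeros(G.shape[1])
for start in range(0, G.shape[0], rows_per_step):
    sel = slice(start, start + rows_per_step)
    i_partial += v[sel] @ G[sel, :]          # one activation step

# Same mathematical result, but each analog step spans a smaller dynamic range.
assert np.allclose(i_full, i_partial)
```

In this simplified model the trade-off is visible directly: fewer activated wordlines per step means more steps (and more digital accumulation), in exchange for relaxed demands on the analog read-out circuitry.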

Papers