Resistive Random Access Memory
Resistive Random Access Memory (RRAM) is being actively researched as a high-performance, energy-efficient hardware substrate for accelerating deep neural networks (DNNs), particularly in edge computing applications. Current research focuses on adapting DNN architectures and training methods to RRAM's unique characteristics, addressing hardware non-idealities such as device faults and conductance variability through hardware-aware training, error compensation, and novel weight-mapping schemes. The goal is to make RRAM-based in-memory computing practical, enabling more efficient and powerful AI systems across a range of applications.
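As a concrete illustration of the in-memory computing idea, the sketch below shows the standard differential crossbar mapping: a layer's signed weights are encoded as pairs of conductances (G+ and G-), and the matrix-vector multiply is performed by Ohm's law and current summation along the columns. The lognormal noise model, the conductance range, and all parameter values are illustrative assumptions, not taken from the papers listed here; it is a minimal first-order sketch of the variability that hardware-aware training must compensate for.

```python
import numpy as np

rng = np.random.default_rng(0)

def program_crossbar(weights, g_min=1e-6, g_max=1e-4, sigma=0.1):
    """Map signed weights onto a differential conductance pair (G+, G-),
    then perturb each cell with lognormal noise to emulate programming
    variability. Conductance range and sigma are hypothetical values."""
    w_max = np.max(np.abs(weights))
    scale = (g_max - g_min) / w_max
    g_pos = g_min + scale * np.clip(weights, 0, None)   # positive weights
    g_neg = g_min + scale * np.clip(-weights, 0, None)  # negative weights
    # Multiplicative lognormal noise: a common first-order device model.
    g_pos *= rng.lognormal(0.0, sigma, g_pos.shape)
    g_neg *= rng.lognormal(0.0, sigma, g_neg.shape)
    return g_pos, g_neg, scale

def crossbar_matvec(g_pos, g_neg, v_in, scale):
    """In-memory multiply: column currents are sums of conductance x
    input voltage (Ohm's law + Kirchhoff's current law); the differential
    readout (G+ minus G-) recovers signed weights."""
    i_out = g_pos @ v_in - g_neg @ v_in
    return i_out / scale  # rescale currents back to weight units

# Usage: compare the ideal product with the variability-afflicted crossbar.
W = rng.standard_normal((4, 8))
x = rng.standard_normal(8)
gp, gn, s = program_crossbar(W)
print("ideal:   ", W @ x)
print("crossbar:", crossbar_matvec(gp, gn, x, s))
```

Without noise, G+ minus G- equals the scaled weight exactly (the g_min offsets cancel in the differential readout); with noise, the output deviates from the ideal product, which is the error that hardware-aware training and compensation schemes target.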
Papers
Sequence learning in a spiking neuronal network with memristive synapses
Younes Bouhadjar, Sebastian Siegel, Tom Tetzlaff, Markus Diesmann, Rainer Waser, Dirk J. Wouters
Device Modeling Bias in ReRAM-based Neural Network Simulations
Osama Yousuf, Imtiaz Hossen, Matthew W. Daniels, Martin Lueker-Boden, Andrew Dienstfrey, Gina C. Adam