Associative Memory
Associative memory research explores how systems store and retrieve information based on associations between data points, aiming to understand and replicate this fundamental cognitive ability. Current research focuses on developing and analyzing model architectures such as Hopfield networks, transformers, and biologically inspired spiking neural networks, often employing techniques like Hebbian learning and self-attention to improve memory capacity, retrieval accuracy, and sequential learning. This field is significant both for advancing our understanding of biological memory mechanisms and for developing more robust and efficient artificial intelligence systems, particularly in areas such as natural language processing and brain-computer interfaces.
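To make the Hopfield-network and Hebbian-learning ideas mentioned above concrete, here is a minimal sketch of a classical binary Hopfield network. It is an illustration of the general technique, not code from any of the papers listed below, and the function names `store` and `recall` are hypothetical: patterns are stored with a Hebbian outer-product rule, and retrieval iterates a sign-threshold update until the state stops changing.

```python
# Minimal classical Hopfield network sketch (assumes only NumPy).
import numpy as np

def store(patterns):
    """Hebbian learning: W = (1/N) * sum_p x_p x_p^T, with zero diagonal."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)  # no self-connections
    return w

def recall(w, probe, max_iters=100):
    """Asynchronous recall: update one unit at a time until a fixed point."""
    state = probe.copy()
    for _ in range(max_iters):
        changed = False
        for i in np.random.permutation(len(state)):
            new = 1 if w[i] @ state >= 0 else -1
            if new != state[i]:
                state[i] = new
                changed = True
        if not changed:  # no unit flipped: stable attractor reached
            break
    return state

rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(3, 100))  # 3 random +/-1 patterns
w = store(patterns)
noisy = patterns[0].copy()
noisy[:15] *= -1                               # corrupt 15 of 100 bits
print(np.array_equal(recall(w, noisy), patterns[0]))  # usually True
```

With only 3 patterns over 100 units, the network is well below the classical capacity limit of roughly 0.14N patterns, so the corrupted probe is typically restored exactly; modern Hopfield variants studied in the papers below aim to push this capacity far higher.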
Papers
SoftmAP: Software-Hardware Co-design for Integer-Only Softmax on Associative Processors
Mariam Rakka, Jinhao Li, Guohao Dai, Ahmed Eltawil, Mohammed E. Fouda, Fadi Kurdahi
Storing overlapping associative memories on latent manifolds in low-rank spiking networks
William F. Podlaski, Christian K. Machens
Exploiting Memory-aware Q-distribution Prediction for Nuclear Fusion via Modern Hopfield Network
Qingchuan Ma, Shiao Wang, Tong Zheng, Xiaodong Dai, Yifeng Wang, Qingquan Yang, Xiao Wang
Losing dimensions: Geometric memorization in generative diffusion
Beatrice Achilli, Enrico Ventura, Gianluigi Silvestri, Bao Pham, Gabriel Raya, Dmitry Krotov, Carlo Lucibello, Luca Ambrogioni