Memory Trace
Memory trace research investigates how information is encoded, stored, and retrieved, with the goal of making memory-based processes more efficient and accurate. Current work focuses on enhancing memory capacity and mitigating issues such as catastrophic forgetting through techniques including external memory banks, attention mechanisms, and novel neural network architectures (e.g., Transformers and recurrent networks). These advances improve the performance of machine learning models in areas such as natural language processing, computer vision, and robotics, and also offer insights into biological memory systems.
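The memory-bank and attention-based retrieval ideas mentioned above can be sketched with a toy example. This is a minimal illustration under stated assumptions, not the method of any listed paper: the `MemoryBank` class name and the FIFO eviction policy are choices made here for demonstration.

```python
import numpy as np

class MemoryBank:
    """Toy external memory: stores key/value vectors, reads via softmax attention."""

    def __init__(self, dim, capacity):
        self.dim = dim
        self.capacity = capacity
        self.keys = np.empty((0, dim))
        self.values = np.empty((0, dim))

    def write(self, key, value):
        # Append a new trace; once capacity is exceeded, evict the oldest (FIFO).
        self.keys = np.vstack([self.keys, key])[-self.capacity:]
        self.values = np.vstack([self.values, value])[-self.capacity:]

    def read(self, query):
        # Scaled dot-product attention over stored keys.
        scores = self.keys @ query / np.sqrt(self.dim)
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()
        # Retrieved trace: attention-weighted blend of stored values.
        return weights @ self.values

rng = np.random.default_rng(0)
bank = MemoryBank(dim=4, capacity=8)
for _ in range(10):
    k = rng.normal(size=4)
    bank.write(k, k)           # store each key as its own value
probe = bank.keys[-1]          # query with the most recently written key
retrieved = bank.read(probe)   # shape (4,)
```

Because reads blend old traces rather than overwrite them, this kind of external store is one simple way to retain past information alongside new writes, which is the intuition behind using memory banks against catastrophic forgetting.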
Papers
Augment Before Copy-Paste: Data and Memory Efficiency-Oriented Instance Segmentation Framework for Sport-scenes
Chih-Chung Hsu, Chia-Ming Lee, Ming-Shyen Wu
StyleChat: Learning Recitation-Augmented Memory in LLMs for Stylized Dialogue Generation
Jinpeng Li, Zekai Zhang, Quan Tu, Xin Cheng, Dongyan Zhao, Rui Yan