Proxy-Based Contrastive Replay

Proxy-based contrastive replay is a continual-learning technique designed to mitigate catastrophic forgetting, the phenomenon where a model loses previously learned information when it adapts to new data. The core idea is to replace the anchor-to-sample pairs of standard contrastive replay with anchor-to-proxy pairs, where each class is summarized by a single learnable proxy, which makes the contrastive objective more stable when the replay buffer is small. Current research focuses on holistic strategies that combine proxy-based and contrastive learning, often adding components such as temperature scaling and knowledge distillation to improve model stability and learning efficiency. These advances aim to raise the performance of continual-learning models on tasks such as relation extraction and reinforcement learning, where the ability to learn incrementally from streaming data is crucial.
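The proxy-based contrastive objective can be sketched as a temperature-scaled softmax over sample-to-proxy similarities. The NumPy snippet below is a minimal illustration, not any paper's reference implementation: the function name, shapes, and the temperature value are assumptions, and in a real system the proxies would be learnable classifier weights updated by backpropagation over a batch of current plus replayed samples.

```python
import numpy as np

def proxy_contrastive_loss(features, labels, proxies, tau=0.1):
    """Sketch of a proxy-based contrastive replay loss.

    features: (B, D) embeddings of current-task and replayed samples
    labels:   (B,)   integer class labels
    proxies:  (C, D) one proxy vector per class (learnable in practice)
    tau:      temperature for scaling the similarities (illustrative value)
    """
    # L2-normalize so the dot products below are cosine similarities
    features = features / np.linalg.norm(features, axis=1, keepdims=True)
    proxies = proxies / np.linalg.norm(proxies, axis=1, keepdims=True)

    # each anchor is contrasted against every class proxy, not other samples
    logits = features @ proxies.T / tau                  # (B, C)

    # temperature-scaled softmax cross-entropy toward the true class proxy
    logits -= logits.max(axis=1, keepdims=True)          # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()
```

Because anchors are compared only against a fixed number of proxies rather than against all other samples, the loss does not degenerate when a class has few (or no) other examples in the replay batch.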

Papers