Paper ID: 2406.10737
Dynamic Domains, Dynamic Solutions: DPCore for Continual Test-Time Adaptation
Yunbei Zhang, Akshay Mehra, Jihun Hamm
Continual Test-Time Adaptation (CTTA) seeks to adapt a source pre-trained model to continually changing, unlabeled target domains. Existing TTA methods are typically designed for environments where domain changes occur sequentially and can struggle in more dynamic scenarios. Inspired by the principles of online K-Means, we introduce DPCore, a novel approach to CTTA based on visual prompting. We propose a \emph{Dynamic Prompt Coreset} that not only preserves knowledge from previously visited domains but also accommodates learning from potential new domains. This is complemented by a distance-based \emph{Weight Updating Mechanism} that keeps the coreset current and relevant. Our approach keeps the model architecture fixed and relies on the coreset and its updating mechanism to mitigate catastrophic forgetting and error accumulation. Extensive experiments on four widely used benchmarks demonstrate that our method consistently outperforms state-of-the-art alternatives on both classification and segmentation CTTA tasks, across both structured and dynamic CTTA settings, with $99\%$ fewer trainable parameters.
Submitted: Jun 15, 2024
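
The abstract describes an online K-Means-inspired prompt coreset with a distance-based weight update. The following is a minimal, hypothetical Python sketch of that bookkeeping only: an incoming batch's feature statistics are assigned to the nearest core element (updating its weight and centroid) or spawn a new element when no element is close enough. All names (`DynamicPromptCoreset`, `PromptCoreElement`, `tau`) are illustrative assumptions rather than the paper's actual implementation, and the learning of the visual prompts themselves is omitted.

```python
# Hypothetical sketch of an online K-Means-style prompt coreset with a
# distance-based weight update; names and thresholds are assumptions,
# not DPCore's actual API.
from dataclasses import dataclass
import numpy as np


@dataclass
class PromptCoreElement:
    centroid: np.ndarray   # running estimate of a domain's feature statistics
    prompt: np.ndarray     # visual prompt parameters associated with that domain
    weight: float = 1.0    # number of batches this element has absorbed


class DynamicPromptCoreset:
    def __init__(self, prompt_dim: int, tau: float = 2.0):
        self.tau = tau              # distance threshold for spawning a new element
        self.prompt_dim = prompt_dim
        self.elements: list[PromptCoreElement] = []

    def update(self, batch_stats: np.ndarray) -> PromptCoreElement:
        """Assign an incoming batch to the nearest core element (online
        K-Means step) or create a new element when none is close enough."""
        if self.elements:
            dists = [np.linalg.norm(batch_stats - e.centroid) for e in self.elements]
            j = int(np.argmin(dists))
            if dists[j] < self.tau:
                e = self.elements[j]
                e.weight += 1.0
                # Running-mean centroid update, as in online K-Means.
                e.centroid += (batch_stats - e.centroid) / e.weight
                return e
        # No sufficiently close domain: start a new element with a fresh prompt.
        new_e = PromptCoreElement(centroid=batch_stats.astype(float).copy(),
                                  prompt=np.zeros(self.prompt_dim))
        self.elements.append(new_e)
        return new_e


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    coreset = DynamicPromptCoreset(prompt_dim=16, tau=2.0)
    # Batches from two synthetic "domains"; the coreset should end with two elements.
    for _ in range(5):
        coreset.update(rng.normal(0.0, 0.1, size=8))
    for _ in range(5):
        coreset.update(rng.normal(5.0, 0.1, size=8))
    print(len(coreset.elements), [e.weight for e in coreset.elements])
```

The threshold `tau` governs when a batch is treated as evidence of a previously seen domain versus a potential new one, mirroring how online K-Means either updates an existing cluster or opens a new cluster; in the paper this role is played by the distance-based weight updating mechanism.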