Paper ID: 2306.03424
GCD-DDPM: A Generative Change Detection Model Based on Difference-Feature Guided DDPM
Yihan Wen, Xianping Ma, Xiaokang Zhang, Man-On Pun
Deep learning (DL)-based methods have recently shown great promise in bitemporal change detection (CD). Existing discriminative methods based on Convolutional Neural Networks (CNNs) and Transformers rely on discriminative representation learning for change recognition but struggle to capture both local and long-range contextual dependencies. As a result, obtaining fine-grained and robust CD maps across diverse ground scenes remains challenging. To address this challenge, this work proposes a generative change detection model called GCD-DDPM that directly generates CD maps by exploiting the Denoising Diffusion Probabilistic Model (DDPM), instead of classifying each pixel as changed or unchanged. Furthermore, a Difference Conditional Encoder (DCE) is designed to guide the generation of CD maps by exploiting multi-level difference features. Leveraging a variational inference (VI) procedure, GCD-DDPM can adaptively recalibrate the CD results through an iterative inference process while accurately distinguishing subtle and irregular changes in diverse scenes. Finally, a Noise Suppression-based Semantic Enhancer (NSSE) is specifically designed to suppress noise in the change-aware feature representations produced by the CD Encoder at the current step. The refined representation serves as an attention map that guides subsequent iterations and enhances CD accuracy. Extensive experiments on four high-resolution CD datasets confirm the superior performance of the proposed GCD-DDPM. The code for this work will be available at https://github.com/udrs/GCD.
Submitted: Jun 6, 2023
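
To illustrate the general idea described in the abstract, the following is a minimal PyTorch-style sketch of a difference-feature-conditioned DDPM sampler for change detection. It is an assumption-laden illustration, not the authors' implementation: the module names (DifferenceConditionalEncoder, ConditionalDenoiser), channel sizes, and the simple feature-subtraction conditioning are hypothetical stand-ins for the paper's DCE and denoising network, and the NSSE refinement step is omitted.

```python
import torch
import torch.nn as nn

class DifferenceConditionalEncoder(nn.Module):
    """Hypothetical stand-in for the paper's DCE: extracts difference
    features from a bitemporal image pair to condition the diffusion model."""
    def __init__(self, in_ch=3, feat_ch=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, feat_ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(feat_ch, feat_ch, 3, padding=1), nn.ReLU(),
        )

    def forward(self, img_a, img_b):
        # Difference of per-image features serves as the conditioning signal.
        return self.encoder(img_a) - self.encoder(img_b)

class ConditionalDenoiser(nn.Module):
    """Hypothetical noise predictor eps_theta(x_t, t, cond) for the CD map."""
    def __init__(self, map_ch=1, feat_ch=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(map_ch + feat_ch, feat_ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(feat_ch, map_ch, 3, padding=1),
        )

    def forward(self, x_t, t, cond):
        # A real implementation would also embed the timestep t; omitted here.
        return self.net(torch.cat([x_t, cond], dim=1))

@torch.no_grad()
def sample_cd_map(dce, denoiser, img_a, img_b, betas):
    """Standard DDPM ancestral sampling, conditioned on difference features."""
    alphas = 1.0 - betas
    alpha_bar = torch.cumprod(alphas, dim=0)
    cond = dce(img_a, img_b)
    x = torch.randn(img_a.size(0), 1, img_a.size(2), img_a.size(3))
    for t in reversed(range(len(betas))):
        eps = denoiser(x, t, cond)
        coef = (1 - alphas[t]) / torch.sqrt(1 - alpha_bar[t])
        mean = (x - coef * eps) / torch.sqrt(alphas[t])
        noise = torch.randn_like(x) if t > 0 else torch.zeros_like(x)
        x = mean + torch.sqrt(betas[t]) * noise
    return x  # continuous CD map; threshold to obtain a binary change mask
```

In this sketch the change map is sampled iteratively from Gaussian noise, with every reverse step conditioned on the bitemporal difference features; thresholding the final sample yields the binary change mask.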