Paper ID: 2310.15081
E4S: Fine-grained Face Swapping via Editing With Regional GAN Inversion
Maomao Li, Ge Yuan, Cairong Wang, Zhian Liu, Yong Zhang, Yongwei Nie, Jue Wang, Dong Xu
This paper proposes a novel approach to face swapping from the perspective of fine-grained facial editing, dubbed "editing for swapping" (E4S). Traditional face swapping methods rely on global feature extraction and often fail to preserve detailed source identity. In contrast, we propose a Regional GAN Inversion (RGI) method, which allows the explicit disentanglement of shape and texture. Specifically, our E4S performs face swapping in the latent space of a pretrained StyleGAN, where a multi-scale mask-guided encoder projects the texture of each facial component into regional style codes and a mask-guided injection module manipulates feature maps with those style codes. Based on this disentanglement, face swapping can be simplified to style and mask swapping. In addition, because of the potentially large gap in lighting conditions, transferring the source skin into the target image may produce inharmonious lighting. We propose a re-coloring network so that the swapped face maintains the target lighting condition while preserving the source skin. Further, to handle potential mismatched areas during mask exchange, we design a face inpainting module to refine the face shape. Extensive comparisons with state-of-the-art methods demonstrate that our E4S outperforms existing approaches in preserving texture, shape, and lighting. Our implementation is available at https://github.com/e4s2024/E4S2024.
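To make the "editing for swapping" idea concrete, the sketch below illustrates how, once textures are disentangled into per-region style codes, swapping reduces to exchanging inner-face style codes and mask regions between source and target. This is a minimal illustration, not the authors' implementation (see the linked repository for that): the region list, `encode_regions`, `generate`, and the label-map encoding of masks are all hypothetical placeholders standing in for the pretrained StyleGAN encoder, the mask-guided injection generator, and the downstream re-coloring and inpainting steps described in the abstract.

```python
# Illustrative sketch of E4S-style "editing for swapping" (assumed interfaces,
# not the authors' code). Textures are represented as one style code per facial
# region; masks are integer label maps over the same region set.
import numpy as np

FACE_REGIONS = ["skin", "brows", "eyes", "nose", "mouth", "hair"]   # assumed region set
REGION_ID = {r: i for i, r in enumerate(FACE_REGIONS, start=1)}
INNER_FACE = ["skin", "brows", "eyes", "nose", "mouth"]             # regions taken from the source

def encode_regions(image, mask):
    """Stand-in for the multi-scale mask-guided encoder: one 512-d style code per region."""
    rng = np.random.default_rng(int(image.sum() * 1000) % (2**32))  # deterministic dummy codes
    return {r: rng.standard_normal(512) for r in FACE_REGIONS}

def generate(style_codes, mask):
    """Stand-in for the StyleGAN generator with mask-guided style injection."""
    return {"codes": style_codes, "mask": mask}                     # placeholder output

def swap_faces(src_img, src_mask, tgt_img, tgt_mask):
    src_codes = encode_regions(src_img, src_mask)                   # source textures
    tgt_codes = encode_regions(tgt_img, tgt_mask)                   # target textures

    # Texture swap: inner-face style codes come from the source; hair and the
    # remaining context keep the target's codes.
    swapped_codes = {r: (src_codes[r] if r in INNER_FACE else tgt_codes[r])
                     for r in FACE_REGIONS}

    # Mask swap: paste the source's inner-face layout into the target mask.
    # In the paper, the inpainting module would refine mismatched areas and the
    # re-coloring network would harmonize skin lighting afterwards.
    inner_ids = [REGION_ID[r] for r in INNER_FACE]
    swapped_mask = np.where(np.isin(src_mask, inner_ids), src_mask, tgt_mask)

    return generate(swapped_codes, swapped_mask)

# Toy usage with random "images" and label maps (labels 1..6 follow FACE_REGIONS order).
src_img, tgt_img = np.random.rand(256, 256, 3), np.random.rand(256, 256, 3)
src_mask = np.random.randint(1, 7, size=(256, 256))
tgt_mask = np.random.randint(1, 7, size=(256, 256))
result = swap_faces(src_img, src_mask, tgt_img, tgt_mask)
```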
Submitted: Oct 23, 2023