Video Style Transfer
Video style transfer automatically alters the artistic style of video content to mimic the appearance of paintings, illustrations, or other visual styles. Recent research focuses on improving temporal consistency across frames, often employing feature warping, optical flow, and recurrent neural networks within generative adversarial networks (GANs) or diffusion models. These advances tackle the core challenge of preserving content fidelity while maintaining a consistent style and acceptable runtime, yielding more coherent, artifact-free results. The field holds significant potential for applications in film production, video game development, and creative content generation.
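As a rough illustration of the flow-based temporal consistency idea mentioned above, the PyTorch sketch below warps the previously stylized frame with backward optical flow and penalizes deviations from it in non-occluded regions. The function names, tensor shapes, and the occlusion mask are illustrative assumptions, not the method of any specific paper listed on this page.

```python
# Minimal sketch of a flow-warping temporal consistency loss (assumed setup,
# not a reference implementation). Tensors: frames are (N, C, H, W),
# backward optical flow is (N, 2, H, W), occlusion_mask is (N, 1, H, W)
# with 1 where the flow is reliable and 0 in occluded/disoccluded regions.
import torch
import torch.nn.functional as F


def warp(frame, flow):
    """Warp a frame with backward optical flow via bilinear sampling."""
    n, _, h, w = frame.shape
    # Base sampling grid in pixel coordinates.
    ys, xs = torch.meshgrid(
        torch.arange(h, device=frame.device),
        torch.arange(w, device=frame.device),
        indexing="ij",
    )
    grid = torch.stack((xs, ys), dim=0).float().unsqueeze(0)  # (1, 2, H, W)
    # Displace the grid by the flow, then normalize to [-1, 1] for grid_sample.
    coords = grid + flow
    coords_x = 2.0 * coords[:, 0] / max(w - 1, 1) - 1.0
    coords_y = 2.0 * coords[:, 1] / max(h - 1, 1) - 1.0
    sample_grid = torch.stack((coords_x, coords_y), dim=-1)  # (N, H, W, 2)
    return F.grid_sample(frame, sample_grid, align_corners=True)


def temporal_consistency_loss(stylized_t, stylized_prev, flow, occlusion_mask):
    """Penalize deviation from the flow-warped previous stylized frame,
    only where the flow is valid according to the occlusion mask."""
    warped_prev = warp(stylized_prev, flow)
    diff = (stylized_t - warped_prev) ** 2
    return (occlusion_mask * diff).mean()
```

In practice this loss is typically added to the usual content and style objectives during training (or used at inference time to blend frames), so that the stylization network is discouraged from flickering between consecutive frames.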