3D Style Transfer
3D style transfer aims to apply the artistic style of a 2D image to a 3D scene, producing stylized 3D renderings. Current research focuses on efficient and generalizable methods, often leveraging neural radiance fields (NeRFs) or Gaussian splatting for scene representation, and incorporating diffusion models or hypernetworks for style transfer. These advances enable faster processing, improved multi-view consistency, and finer control over the stylization process, including object-specific or semantic-aware style application. The field holds significant potential for creative content generation, virtual and augmented reality experiences, and applications in digital art and heritage preservation.
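At the core of most of these methods is a style loss that compares feature statistics of the rendered view against those of the reference style image. A common choice, borrowed from 2D neural style transfer, is to match channel-wise Gram matrices of deep feature maps. The sketch below is illustrative, not taken from any specific paper in this collection; the function names and the plain NumPy setup are assumptions for demonstration (real pipelines would compute features with a pretrained network such as VGG and backpropagate through the 3D representation).

```python
import numpy as np

def gram_matrix(features):
    """Channel-wise Gram matrix of a (C, H, W) feature map,
    normalized by the total number of elements."""
    c, h, w = features.shape
    flat = features.reshape(c, h * w)
    return flat @ flat.T / (c * h * w)

def style_loss(rendered_feats, style_feats):
    """Mean squared difference between the Gram matrices of the
    rendered view's features and the style image's features."""
    g_rendered = gram_matrix(rendered_feats)
    g_style = gram_matrix(style_feats)
    return float(np.mean((g_rendered - g_style) ** 2))

# Hypothetical usage: identical features give zero loss.
rng = np.random.default_rng(0)
feats = rng.standard_normal((4, 8, 8))
print(style_loss(feats, feats))
```

In a 3D setting, this loss is typically averaged over several rendered viewpoints, which is one source of the multi-view consistency mentioned above: the underlying 3D representation (NeRF or Gaussian splat), rather than each image independently, is what gets optimized.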
Papers