Style Transfer
Style transfer aims to modify the style of data such as images, audio, 3D scenes, or text while preserving its underlying content. Current research focuses on efficient and controllable methods built on architectures such as diffusion models, neural radiance fields, transformers, and Gaussian splatting, often combined with attention mechanisms or optimization-based techniques to enable training-free or few-shot operation. These advances affect diverse fields, including image editing, 3D modeling, audio processing, and natural language processing, by giving practitioners finer creative control and more efficient manipulation of multimedia data. High-quality, controllable style transfer is therefore important for applications ranging from artistic expression to medical image analysis.
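To make the optimization-based branch of this family concrete, below is a minimal sketch of classic image style transfer in the spirit of Gatys et al.: a copy of the content image is optimized so that its Gram-matrix statistics match those of a style image while its deeper features stay close to the content image. It assumes PyTorch and a recent torchvision are available; the layer indices, weights, and step count are illustrative choices, not the configuration of any paper listed here.

```python
# Sketch of optimization-based style transfer (Gatys-style), assuming PyTorch/torchvision.
import torch
import torch.nn.functional as F
from torchvision.models import vgg19, VGG19_Weights

device = "cuda" if torch.cuda.is_available() else "cpu"
vgg = vgg19(weights=VGG19_Weights.DEFAULT).features.to(device).eval()
for p in vgg.parameters():
    p.requires_grad_(False)

STYLE_LAYERS = {0, 5, 10, 19, 28}   # conv1_1 .. conv5_1 indices in vgg19.features
CONTENT_LAYER = 21                  # conv4_2

def extract(x):
    """Collect style (Gram) and content features from VGG activations."""
    style_feats, content_feat = [], None
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in STYLE_LAYERS:
            b, c, h, w = x.shape
            f = x.view(b, c, h * w)
            style_feats.append(f @ f.transpose(1, 2) / (c * h * w))  # Gram matrix
        if i == CONTENT_LAYER:
            content_feat = x
    return style_feats, content_feat

def stylize(content_img, style_img, steps=300, style_weight=1e6, content_weight=1.0):
    """Optimize a copy of the content image toward the style image's Gram statistics."""
    with torch.no_grad():
        style_grams, _ = extract(style_img)
        _, content_target = extract(content_img)
    output = content_img.clone().requires_grad_(True)
    opt = torch.optim.Adam([output], lr=0.02)
    for _ in range(steps):
        opt.zero_grad()
        grams, content_feat = extract(output)
        style_loss = sum(F.mse_loss(g, t) for g, t in zip(grams, style_grams))
        content_loss = F.mse_loss(content_feat, content_target)
        (style_weight * style_loss + content_weight * content_loss).backward()
        opt.step()
    return output.detach()
```

Both inputs are expected as normalized (1, 3, H, W) tensors on the same device; feed-forward, diffusion-based, and 3D (NeRF or Gaussian splatting) methods in the papers below replace this per-image optimization loop with learned or scene-specific representations.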
Papers
Synthetic Latent Fingerprint Generation Using Style Transfer
Amol S. Joshi, Ali Dabouei, Nasser Nasrabadi, Jeremy Dawson
Style Transfer and Self-Supervised Learning Powered Myocardium Infarction Super-Resolution Segmentation
Lichao Wang, Jiahao Huang, Xiaodan Xing, Yinzhe Wu, Ramyah Rajakulasingam, Andrew D. Scott, Pedro F Ferreira, Ranil De Silva, Sonia Nielles-Vallespin, Guang Yang
MOSAIC: Multi-Object Segmented Arbitrary Stylization Using CLIP
Prajwal Ganugula, Y S S S Santosh Kumar, N K Sagar Reddy, Prabhath Chellingi, Avinash Thakur, Neeraj Kasera, C Shyam Anand
MM-NeRF: Multimodal-Guided 3D Multi-Style Transfer of Neural Radiance Field
Zijiang Yang, Zhongwei Qiu, Chang Xu, Dongmei Fu
Specializing Small Language Models towards Complex Style Transfer via Latent Attribute Pre-Training
Ruiqi Xu, Yongfeng Huang, Xin Chen, Lin Zhang
Locally Stylized Neural Radiance Fields
Hong-Wing Pang, Binh-Son Hua, Sai-Kit Yeung
Retinex-guided Channel-grouping based Patch Swap for Arbitrary Style Transfer
Chang Liu, Yi Niu, Mingming Ma, Fu Li, Guangming Shi
In-Style: Bridging Text and Uncurated Videos with Style Transfer for Text-Video Retrieval
Nina Shvetsova, Anna Kukleva, Bernt Schiele, Hilde Kuehne
Enhancing Visual Perception in Novel Environments via Incremental Data Augmentation Based on Style Transfer
Abhibha Gupta, Rully Agus Hendrawan, Mansur Arief
ParaGuide: Guided Diffusion Paraphrasers for Plug-and-Play Textual Style Transfer
Zachary Horvitz, Ajay Patel, Chris Callison-Burch, Zhou Yu, Kathleen McKeown
WSAM: Visual Explanations from Style Augmentation as Adversarial Attacker and Their Influence in Image Classification
Felipe Moreno-Vera, Edgar Medina, Jorge Poco