Style Transformer
Style transformers are a class of neural network models that leverage the transformer architecture to manipulate image style, primarily for tasks such as artistic style transfer, image-to-image translation, and dataset bias mitigation. Current research focuses on improving efficiency (e.g., parameter sharing across transformer layers), enhancing controllability (e.g., through meta-learning and learnable scaling operations), and addressing challenges such as dataset bias (e.g., using diffusion models to generate synthetic training data). These advances matter for a range of applications, including image editing, content creation, and improving the fairness and robustness of machine learning models.
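To make the architecture concrete, below is a minimal PyTorch sketch of a transformer-based style-transfer block in which content tokens self-attend and then cross-attend to style tokens, with an optional flag for sharing one block's parameters across layers. The class names, dimensions, and the `share_params` flag are illustrative assumptions for this sketch, not the design of any specific paper listed on this page.

```python
import torch
import torch.nn as nn

class StyleTransformerBlock(nn.Module):
    """Decoder-style block: content tokens attend to style tokens."""
    def __init__(self, dim: int = 256, num_heads: int = 8):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.cross_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.ffn = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
        self.norm1, self.norm2, self.norm3 = nn.LayerNorm(dim), nn.LayerNorm(dim), nn.LayerNorm(dim)

    def forward(self, content: torch.Tensor, style: torch.Tensor) -> torch.Tensor:
        # Self-attention over content tokens preserves spatial structure.
        c = self.norm1(content)
        x = content + self.self_attn(c, c, c)[0]
        # Cross-attention injects style: queries from content, keys/values from style tokens.
        x = x + self.cross_attn(self.norm2(x), style, style)[0]
        # Position-wise feed-forward refinement.
        return x + self.ffn(self.norm3(x))

class StyleTransformer(nn.Module):
    """Stack of blocks; share_params=True reuses one set of weights across layers."""
    def __init__(self, dim: int = 256, num_heads: int = 8, depth: int = 4, share_params: bool = True):
        super().__init__()
        if share_params:
            block = StyleTransformerBlock(dim, num_heads)
            self.blocks = nn.ModuleList([block] * depth)  # same module registered repeatedly
        else:
            self.blocks = nn.ModuleList(StyleTransformerBlock(dim, num_heads) for _ in range(depth))

    def forward(self, content_tokens: torch.Tensor, style_tokens: torch.Tensor) -> torch.Tensor:
        x = content_tokens
        for blk in self.blocks:
            x = blk(x, style_tokens)
        return x

# Usage: 16x16 = 256 patch tokens per image, embedding dimension 256.
content = torch.randn(2, 256, 256)  # (batch, tokens, dim)
style = torch.randn(2, 256, 256)
stylized = StyleTransformer()(content, style)
print(stylized.shape)  # torch.Size([2, 256, 256])
```

The parameter-sharing option reflects the efficiency direction mentioned above: reusing one block's weights across depth cuts parameter count roughly by the number of layers, at some cost in capacity.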