Gradient Surgery
Gradient surgery is a deep learning optimization technique that improves training by detecting and resolving conflicts among the gradients of multiple loss functions, typically by projecting one gradient away from the components that oppose another. Current research applies gradient surgery to diverse problems, including hardening large language models against adversarial attacks, improving the generalization of medical image segmentation models, and optimizing generative models for one-shot unlearning. By balancing competing objectives during training, the approach improves model performance and efficiency across applications in natural language processing, computer vision, and medical image analysis.
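As a concrete illustration, the projection step can be sketched as follows. This is a minimal NumPy sketch of PCGrad-style conflict resolution, not a reference implementation: each task gradient has its conflicting component (negative dot product) with every other task's gradient projected out before the gradients are summed.

```python
import numpy as np

def gradient_surgery(grads):
    """PCGrad-style gradient surgery (minimal sketch).

    For each task gradient, remove the component that conflicts
    (negative dot product) with any other task's gradient, then
    sum the adjusted gradients into a single update direction.
    """
    grads = [np.asarray(g, dtype=float) for g in grads]
    adjusted = []
    for i, g_i in enumerate(grads):
        g = g_i.copy()
        for j, g_j in enumerate(grads):
            if i == j:
                continue
            dot = g @ g_j
            if dot < 0:
                # Conflict: project g onto the normal plane of g_j.
                g -= dot / (g_j @ g_j) * g_j
        adjusted.append(g)
    return np.sum(adjusted, axis=0)

# Two conflicting task gradients: their dot product is negative.
g1 = np.array([1.0, 0.0])
g2 = np.array([-1.0, 1.0])
update = gradient_surgery([g1, g2])  # no longer opposes either task
```

After surgery, the combined update has a non-negative dot product with each original task gradient, so no single task is directly harmed by the shared step.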