Sharpness-Aware Optimization
Sharpness-Aware Optimization (SAO) aims to improve the generalization of deep learning models by minimizing the sharpness of the loss landscape around the model's parameters, i.e., by seeking flat minima whose loss stays low under small parameter perturbations. Current research explores SAO algorithms such as Sharpness-Aware Minimization (SAM) and its adaptive variants, applies them to diverse architectures including convolutional neural networks and transformers, and embeds them in continual learning frameworks. The approach shows promise for improving model robustness and performance across domains such as medical image analysis and time series forecasting, particularly where data scarcity or distribution shift is a significant challenge.
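To make the idea concrete, SAM approximately solves min_w max_{||eps|| <= rho} L(w + eps): it first ascends to a nearby "sharp" point w + eps along the gradient direction, then updates the original weights w using the gradient measured at that perturbed point. The following is a minimal sketch in PyTorch, not any particular paper's implementation; the function name sam_step, the rho value, and the 1e-12 stabilizer are illustrative assumptions, and model, loss_fn, and base_optimizer are placeholders for whatever setup you already have.

```python
# Minimal sketch of the SAM two-step update (assumed setup: a PyTorch model,
# a batch of (inputs, targets), and a base optimizer such as SGD).
import torch

def sam_step(model, loss_fn, inputs, targets, base_optimizer, rho=0.05):
    # First pass: gradient of the loss at the current weights w.
    loss = loss_fn(model(inputs), targets)
    loss.backward()

    # Build the perturbation eps = rho * g / ||g|| and move to w + eps.
    grads = [p.grad for p in model.parameters() if p.grad is not None]
    grad_norm = torch.norm(torch.stack([g.norm(p=2) for g in grads]), p=2)
    perturbed = []
    with torch.no_grad():
        for p in model.parameters():
            if p.grad is None:
                continue
            eps = rho * p.grad / (grad_norm + 1e-12)
            p.add_(eps)              # ascend toward the sharpest nearby point
            perturbed.append((p, eps))
    model.zero_grad()

    # Second pass: gradient at w + eps, then restore w and apply the update
    # with the base optimizer, so w moves toward a flatter region.
    loss_fn(model(inputs), targets).backward()
    with torch.no_grad():
        for p, eps in perturbed:
            p.sub_(eps)              # undo the perturbation before stepping
    base_optimizer.step()
    base_optimizer.zero_grad()
    return loss.item()
```

In a training loop this replaces the usual loss.backward(); optimizer.step() pair at the cost of a second forward and backward pass per batch, which is the main overhead of SAM-style methods.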
November 10, 2022