Private Hyperparameter Tuning
Private hyperparameter tuning focuses on selecting optimal settings for machine learning models while preserving the privacy of the training data, a crucial yet often overlooked aspect of differentially private (DP) machine learning. Current research emphasizes developing adaptive optimization methods, such as modifications of DP-SGD and DP-Adam, and exploring techniques like subsampling and automatic clipping to reduce the privacy cost and computational burden of hyperparameter searches. These advancements aim to bridge the gap between the efficiency of non-private hyperparameter optimization and the privacy guarantees required for sensitive data, ultimately improving the practicality and reliability of DP machine learning.
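The core mechanism these methods build on can be illustrated with a minimal DP-SGD-style update: per-example gradients are clipped to a fixed L2 bound, summed, and perturbed with Gaussian noise before averaging. This is an illustrative sketch, not any specific paper's implementation; the function name `dp_sgd_step` and all parameter defaults are assumptions. Note that `clip_norm` and `noise_multiplier` are exactly the kind of hyperparameters whose tuning itself consumes privacy budget, which is what this line of work addresses.

```python
import numpy as np

def dp_sgd_step(params, per_example_grads, lr=0.1,
                clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """One illustrative DP-SGD update (hypothetical helper, not a
    reference implementation).

    Each per-example gradient is clipped to L2 norm `clip_norm`,
    the clipped gradients are summed, Gaussian noise with standard
    deviation `noise_multiplier * clip_norm` is added to the sum,
    and the noisy mean drives a plain gradient step.
    """
    rng = rng or np.random.default_rng(0)
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down only gradients whose norm exceeds the clip bound.
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    grad_sum = np.sum(clipped, axis=0)
    # Noise is calibrated to the clip norm, the per-example sensitivity.
    noise = rng.normal(0.0, noise_multiplier * clip_norm,
                       size=grad_sum.shape)
    noisy_mean = (grad_sum + noise) / len(per_example_grads)
    return params - lr * noisy_mean

# Example: two per-example gradients, the first exceeding the clip norm.
params = np.zeros(3)
grads = [np.array([3.0, 0.0, 0.0]), np.array([0.1, 0.2, 0.0])]
new_params = dp_sgd_step(params, grads, rng=np.random.default_rng(42))
```

"Automatic clipping" variants in the literature replace the fixed `clip_norm` with a normalization that removes it as a tunable knob, shrinking the hyperparameter search space and thus its privacy cost.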
Papers
- October 22, 2024
- February 20, 2024
- June 9, 2023
- January 27, 2023
- December 8, 2022
- June 14, 2022
- November 9, 2021