Private Hyperparameter Tuning

Private hyperparameter tuning addresses the problem of selecting optimal settings for machine learning models while preserving the privacy of the training data, a crucial yet often overlooked aspect of differentially private (DP) machine learning: the hyperparameter search itself consumes privacy budget, since each candidate configuration is evaluated against the sensitive data. Current research emphasizes adaptive optimization methods, such as modifications of DP-SGD and DP-Adam, and techniques like subsampling and automatic clipping that reduce the privacy cost and computational burden of hyperparameter searches. These advances aim to close the gap between the efficiency of non-private hyperparameter optimization and the privacy guarantees required for sensitive data, improving the practicality and reliability of DP machine learning.
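To make the role of the tuned quantities concrete, the sketch below shows a single DP-SGD-style update in plain NumPy: per-example gradients are clipped to a norm bound and Gaussian noise scaled to that bound is added. The clipping norm and noise multiplier are exactly the kind of hyperparameters the methods above tune privately. All names (`dp_sgd_step`, the argument list) are illustrative, not from any particular library.

```python
import numpy as np

def dp_sgd_step(params, per_sample_grads, clip_norm, noise_multiplier, lr, rng):
    """One illustrative DP-SGD update on a flat parameter vector."""
    # Clip each example's gradient so no single record can move the
    # average by more than clip_norm (bounds the sensitivity).
    clipped = []
    for g in per_sample_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    # Average the clipped gradients and add Gaussian noise whose scale
    # is tied to the clipping bound and the chosen noise multiplier.
    noisy_grad = np.mean(clipped, axis=0) + rng.normal(
        0.0, noise_multiplier * clip_norm / len(clipped), size=params.shape
    )
    return params - lr * noisy_grad
```

Both `clip_norm` and `noise_multiplier` trade accuracy against privacy, which is why searching over them naively multiplies the privacy cost and motivates the subsampling and automatic-clipping approaches mentioned above.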

Papers