Private Optimization

Private optimization aims to train machine learning models while guaranteeing the privacy of individual data points, typically through differential privacy mechanisms. The standard approach, differentially private stochastic gradient descent (DP-SGD), clips each example's gradient and perturbs the aggregate with calibrated noise, and this noise degrades accuracy. Current research therefore focuses on improving the efficiency and accuracy of differentially private optimizers by incorporating techniques such as Kalman filtering, low-pass filtering, and adaptive preconditioners that mitigate the impact of the added noise. These advances are crucial for deploying machine learning in sensitive domains like healthcare and finance, where data privacy is paramount, and they are driving progress in both the theoretical understanding and the practical application of differential privacy.
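
To make the mechanism concrete, below is a minimal NumPy sketch of one DP-SGD step (per-example clipping plus Gaussian noise) together with a simple exponential moving average standing in for the low-pass filtering idea. The function names and hyperparameter values (`clip_norm`, `noise_multiplier`, `beta`) are illustrative assumptions, not taken from any particular paper.

```python
import numpy as np

def dp_sgd_step(params, per_example_grads, lr=0.1,
                clip_norm=1.0, noise_multiplier=1.1,
                rng=np.random.default_rng()):
    """One illustrative DP-SGD step.

    per_example_grads: array of shape (batch_size, dim), one gradient
    per training example, so each example's influence can be bounded.
    """
    batch_size = per_example_grads.shape[0]

    # 1. Clip each example's gradient to L2 norm <= clip_norm,
    #    bounding the sensitivity of the summed gradient.
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    scale = np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    clipped = per_example_grads * scale

    # 2. Sum the clipped gradients and add Gaussian noise whose scale
    #    is calibrated to that sensitivity; the privacy guarantee then
    #    follows from noise_multiplier and the number of steps via a
    #    composition/accounting argument.
    noise = rng.normal(0.0, noise_multiplier * clip_norm,
                       size=per_example_grads.shape[1])
    noisy_grad = (clipped.sum(axis=0) + noise) / batch_size

    # 3. Ordinary gradient descent update on the privatized gradient.
    return params - lr * noisy_grad

def lowpass_update(smoothed_grad, noisy_grad, beta=0.9):
    # An exponential moving average acts as a simple low-pass filter:
    # i.i.d. Gaussian noise averages out across steps while the slowly
    # varying true gradient signal is preserved.
    return beta * smoothed_grad + (1.0 - beta) * noisy_grad
```

In practice the noise multiplier is chosen by a privacy accountant to meet a target (ε, δ) budget, and the more sophisticated filters mentioned above (e.g., Kalman filtering) additionally model the gradient dynamics rather than averaging all past steps with fixed weights.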

Papers