Private Optimization
Private optimization aims to train machine learning models while guaranteeing the privacy of individual data points, typically through differential privacy mechanisms. Current research focuses on improving the efficiency and accuracy of differentially private optimizers, such as variants of stochastic gradient descent (DP-SGD), by incorporating techniques like Kalman filtering, low-pass filtering, and adaptive preconditioners to mitigate the accuracy loss caused by the noise injected for privacy. These advances are crucial for deploying machine learning in sensitive domains such as healthcare and finance, where data privacy is paramount, and they drive progress in both the theoretical understanding and the practical application of differential privacy.
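As a concrete illustration of the base mechanism these optimizers build on, below is a minimal NumPy sketch of DP-SGD applied to logistic regression: each example's gradient is clipped to a fixed L2 norm, and calibrated Gaussian noise is added to the batch sum. The function name `dp_sgd` and all hyperparameter values are illustrative, not taken from any specific paper, and a real deployment would pair this loop with a privacy accountant to track the resulting (ε, δ) budget.

```python
import numpy as np

def dp_sgd(X, y, epochs=5, lr=0.1, batch_size=32,
           clip_norm=1.0, noise_multiplier=1.1, seed=0):
    """Sketch of DP-SGD for logistic regression: per-example
    gradient clipping followed by Gaussian noise.

    clip_norm and noise_multiplier are illustrative; the
    (epsilon, delta) guarantee they yield must be computed with
    a privacy accountant, which this sketch omits.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        batches = np.array_split(rng.permutation(n),
                                 max(1, n // batch_size))
        for idx in batches:
            Xb, yb = X[idx], y[idx]
            # Per-example gradients of the logistic loss:
            # grad_i = (sigmoid(x_i . w) - y_i) * x_i
            p = 1.0 / (1.0 + np.exp(-Xb @ w))
            grads = (p - yb)[:, None] * Xb          # shape (B, d)
            # Clip each example's gradient to L2 norm <= clip_norm,
            # bounding any single point's influence on the update.
            norms = np.linalg.norm(grads, axis=1, keepdims=True)
            grads = grads / np.maximum(1.0, norms / clip_norm)
            # Sum, add noise with std = noise_multiplier * clip_norm
            # (calibrated to the clipped sensitivity), then average.
            noisy = grads.sum(axis=0) + rng.normal(
                0.0, noise_multiplier * clip_norm, size=d)
            w -= lr * noisy / len(idx)
    return w

# Hypothetical usage on synthetic data.
rng = np.random.default_rng(1)
X = rng.normal(size=(256, 5))
y = (X @ rng.normal(size=5) > 0).astype(float)
w = dp_sgd(X, y)
```

The noise scale is tied to `clip_norm` because clipping bounds the sensitivity of the batch sum to any single example; the filtering and preconditioning techniques mentioned above aim to recover accuracy lost to exactly this injected noise.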