Private Stochastic Optimization

Private stochastic optimization aims to train machine learning models under differential privacy, bounding how much any single training example can influence the learned model. Current research focuses on improving the efficiency and accuracy of differentially private stochastic gradient descent (DP-SGD) and its variants, which clip per-example gradients and inject calibrated noise; techniques such as Kalman filtering, low-pass filtering, and adaptive clipping are being explored to mitigate the accuracy loss that the injected noise causes. These advances are crucial for the responsible use of large datasets in privacy-sensitive domains such as healthcare and finance, and they are driving improvements in both theoretical privacy guarantees and practical training efficiency.
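To make the clip-then-noise mechanism concrete, here is a minimal sketch of one DP-SGD step in the style of Abadi et al.'s algorithm. It is illustrative only: the linear-regression loss, the hyperparameters (`clip_norm`, `noise_multiplier`, `lr`), and the function names are assumptions for this example, not taken from any specific paper in this collection, and a real deployment would additionally track the cumulative privacy budget with a privacy accountant.

```python
import numpy as np

def dp_sgd_step(params, per_example_grads, lr=0.1,
                clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """One DP-SGD update (illustrative hyperparameters):
    clip each example's gradient to L2 norm `clip_norm`, sum,
    add Gaussian noise scaled to the clipping bound, average,
    and take a gradient step."""
    rng = rng or np.random.default_rng()
    batch_size = per_example_grads.shape[0]

    # Clip each per-example gradient to bound any individual's influence.
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    scale = np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    clipped = per_example_grads * scale

    # Add noise calibrated to the clipping bound, then average.
    noise = rng.normal(0.0, noise_multiplier * clip_norm,
                       size=params.shape)
    noisy_grad = (clipped.sum(axis=0) + noise) / batch_size

    return params - lr * noisy_grad

# Toy usage: per-example gradients of a squared loss for a linear model.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(32, 5)), rng.normal(size=32)
w = np.zeros(5)
for _ in range(100):
    residuals = X @ w - y           # shape (32,)
    grads = residuals[:, None] * X  # one gradient row per example
    w = dp_sgd_step(w, grads, rng=rng)
```

The clipping step is what makes the noise scale meaningful: because no single example can contribute more than `clip_norm` to the summed gradient, Gaussian noise with standard deviation proportional to `clip_norm` masks any individual's contribution. The filtering and adaptive-clipping variants mentioned above modify this basic loop to recover accuracy lost to that noise.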

Papers