Adaptive Differential Privacy
Adaptive differential privacy (ADP) enhances traditional differential privacy by dynamically adjusting the level of noise added to data, aiming to optimize the trade-off between privacy protection and data utility. Current research focuses on developing ADP mechanisms within federated learning, adapting algorithms for online convex optimization and contextual bandits, and improving composition theorems for more flexible privacy guarantees. These advancements are significant because they enable more accurate and efficient analyses while maintaining strong privacy protections in various applications, including healthcare and machine learning.
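The core idea of adjusting noise dynamically can be illustrated with a minimal sketch. The class below is a hypothetical example (not from any cited paper): it tracks a total privacy budget and lets the caller spend a chosen fraction of the remaining budget on each query, so higher-priority queries receive less Laplace noise. The class name, method names, and the budget-splitting rule are all illustrative assumptions.

```python
import math
import random

class AdaptiveLaplaceMechanism:
    """Hypothetical sketch of adaptive budget allocation: each query spends
    a caller-chosen fraction of the remaining privacy budget, trading noise
    (utility) against the budget left for future queries."""

    def __init__(self, total_epsilon: float):
        self.remaining = total_epsilon  # unspent privacy budget

    def _laplace(self, scale: float) -> float:
        # Inverse-CDF sampling from Laplace(0, scale) using only the stdlib.
        u = random.random() - 0.5
        return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

    def answer(self, true_value: float, sensitivity: float,
               fraction: float = 0.5) -> float:
        """Answer one query, spending `fraction` of the remaining budget.

        A larger fraction means a larger epsilon for this query, hence a
        smaller noise scale (sensitivity / epsilon) and a more accurate
        answer, at the cost of future queries.
        """
        eps = self.remaining * fraction
        self.remaining -= eps
        return true_value + self._laplace(sensitivity / eps)
```

For example, answering two queries at `fraction=0.5` leaves a quarter of the original budget, so later answers become noisier unless the allocation strategy adapts. Real ADP mechanisms replace the fixed fraction with a data- or utility-driven rule and account for composition across queries.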