Sharp Bound
"Sharp bounds" research establishes precise, mathematically rigorous limits on the performance or behavior of algorithms and models across diverse fields. Current work emphasizes tightening existing bounds for algorithms in differential privacy, graph neural networks (particularly poly-GNNs), federated learning (both sequential and parallel), and optimization methods such as proximal gradient descent, often through analysis of convergence rates and error bounds. These advances refine theoretical understanding and enable more accurate performance guarantees, with impact on data privacy, machine learning model design, and algorithm verification.
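To make the idea of a sharp convergence bound concrete, here is a minimal sketch (not drawn from any of the surveyed papers) that runs proximal gradient descent (ISTA) on a small lasso problem and checks the classical sublinear bound F(x_k) - F(x*) <= L ||x_0 - x*||^2 / (2k), where L is the Lipschitz constant of the gradient of the smooth part. The problem data, sizes, and helper names are illustrative assumptions.

```python
import numpy as np

# Illustrative lasso instance: minimize 0.5*||Ax - b||^2 + lam*||x||_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
b = rng.standard_normal(40)
lam = 0.1

# Lipschitz constant of the gradient of the smooth part is sigma_max(A)^2.
L = np.linalg.norm(A, 2) ** 2

def F(x):
    return 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x))

def prox_l1(x, t):
    # Soft-thresholding: the proximal operator of t*lam*||.||_1.
    return np.sign(x) * np.maximum(np.abs(x) - t * lam, 0.0)

def ista(x0, k):
    # k steps of proximal gradient descent with step size 1/L.
    x = x0.copy()
    for _ in range(k):
        x = prox_l1(x - (1.0 / L) * (A.T @ (A @ x - b)), 1.0 / L)
    return x

x0 = np.zeros(20)
x_star = ista(x0, 20000)   # high-accuracy run as a proxy for a minimizer
F_star = F(x_star)

for k in [10, 100, 1000]:
    gap = F(ista(x0, k)) - F_star
    bound = L * np.linalg.norm(x0 - x_star) ** 2 / (2 * k)
    # The observed suboptimality gap respects the O(1/k) bound at every k.
    assert gap <= bound + 1e-9
```

The bound is "sharp" in the sense that, over the class of convex problems with L-Lipschitz gradients, the O(1/k) rate cannot be improved for this method; research in this area constructs worst-case instances that meet such bounds exactly.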
17 papers