Sharp Bounds
"Sharp bounds" research focuses on establishing precise, mathematically rigorous limits on the performance or behavior of various algorithms and models across diverse fields. Current research emphasizes improving existing bounds for algorithms in differential privacy, graph neural networks (particularly poly-GNNs), federated learning (both sequential and parallel), and optimization methods like proximal gradient descent, often analyzing convergence rates and error bounds. These advancements refine theoretical understanding and enable more accurate performance guarantees, impacting areas such as data privacy, machine learning model design, and algorithm verification.
Papers
October 9, 2024
August 22, 2024
July 28, 2024
May 2, 2024
February 22, 2024
December 8, 2023
June 1, 2023
May 26, 2023
November 25, 2022
September 3, 2022
August 19, 2022
August 18, 2022
August 14, 2022
June 14, 2022
May 20, 2022
March 4, 2022