Independent Rate
Research on dimension-independent rates focuses on developing algorithms and models whose convergence speed or approximation accuracy does not degrade with the problem's dimensionality, a crucial property for the high-dimensional data common in machine learning. Current efforts concentrate on techniques such as zeroth-order optimization, mirror descent, and score-based generative models, with the aim of improving efficiency in federated learning, differentially private optimization, and generative modeling. These advances matter because they mitigate the "curse of dimensionality," enabling efficient and scalable solutions to complex problems across many fields, including large language model training and optimal transport.
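As a concrete illustration of a (nearly) dimension-independent rate, the sketch below implements entropic mirror descent (exponentiated gradient) on the probability simplex, a textbook example whose convergence guarantee depends on the dimension only through a log(d) factor. This is a generic illustrative sketch, not an implementation from any of the papers collected here; the function name `entropic_mirror_descent` and the toy linear objective are hypothetical choices for demonstration.

```python
import numpy as np

def entropic_mirror_descent(grad, x0, steps, lr):
    """Mirror descent on the probability simplex with the negative-entropy
    mirror map. For G-Lipschitz losses with step size
    lr = sqrt(2 * log(d) / steps) / G, the averaged iterate converges at a
    rate scaling with sqrt(log(d) / steps) -- only logarithmic in dimension.
    """
    x = np.asarray(x0, dtype=float)
    avg = np.zeros_like(x)
    for t in range(steps):
        g = grad(x)
        # Multiplicative-weights update: the Bregman step under negative entropy.
        x = x * np.exp(-lr * g)
        x /= x.sum()
        avg += (x - avg) / (t + 1)  # running average of iterates
    return avg

# Toy usage: minimize f(x) = <c, x> over the simplex; the optimum puts all
# mass on the smallest coordinate of c.
d = 1000
steps = 2000
c = np.random.rand(d)
x_star = entropic_mirror_descent(lambda x: c, np.full(d, 1.0 / d),
                                 steps=steps,
                                 lr=np.sqrt(2 * np.log(d) / steps))
print(c @ x_star, c.min())
```

The negative-entropy mirror map is what removes the polynomial dependence on d that plain projected gradient descent would incur on the simplex; this contrast is the standard motivating example for dimension-independent analyses.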