Independent Rate

Independent-rate research develops algorithms and models whose convergence rates or approximation accuracy do not depend on the problem's dimensionality, a crucial property for the high-dimensional data common in machine learning. Current efforts concentrate on techniques such as zeroth-order optimization, mirror descent, and score-based generative models, with applications to federated learning, differentially private optimization, and generative modeling. These advances are significant because they sidestep the "curse of dimensionality," enabling efficient and scalable solutions to complex problems across fields ranging from large language model training to optimal transport.
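
As a minimal sketch of one classical mechanism behind such rates (not taken from any particular paper listed below, and assuming only NumPy): mirror descent with the entropic mirror map, i.e. exponentiated gradient, over the probability simplex has an error bound that grows only like the square root of log(d), so it is nearly independent of the dimension d. The function name and the toy linear objective are illustrative.

```python
import numpy as np

def exponentiated_gradient(grad_fn, d, steps, lr):
    """Mirror descent on the probability simplex with the entropy mirror map
    (exponentiated gradient). With bounded gradients, its convergence bound
    scales as O(sqrt(log(d) / steps)), i.e. only logarithmically in the
    dimension d -- a standard example of a (nearly) dimension-independent rate.
    """
    x = np.full(d, 1.0 / d)          # uniform start on the simplex
    for _ in range(steps):
        g = grad_fn(x)               # (sub)gradient at the current iterate
        x = x * np.exp(-lr * g)      # multiplicative update from the entropy mirror map
        x /= x.sum()                 # re-normalize back onto the simplex
    return x

if __name__ == "__main__":
    # Toy problem: minimize a linear loss <c, x> over the simplex; the optimum
    # puts all mass on the smallest coordinate of c.
    rng = np.random.default_rng(0)
    d = 10_000
    c = rng.uniform(size=d)
    x = exponentiated_gradient(lambda x: c, d, steps=500, lr=0.5)
    print("loss:", float(c @ x), "min possible:", float(c.min()))
```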

Papers