Oracle Complexity

Oracle complexity studies the minimum number of queries to an oracle (e.g., gradient evaluations) that an algorithm needs to solve an optimization problem to a given accuracy, including trade-offs between query count and other resources such as memory. Current research seeks query-optimal algorithms for various problem classes, including convex and non-convex optimization, analyzing methods such as gradient descent, stochastic Halpern iteration, and variance-reduced policy gradients, and examining their efficiency under different memory constraints. These analyses are central to designing efficient algorithms in machine learning, particularly in deep learning and reinforcement learning, where computational resources are often limited.
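
To make the notion of a query count concrete: for an L-smooth convex objective, plain gradient descent needs on the order of L/ε gradient queries to reach accuracy ε, while accelerated methods achieve the optimal order of sqrt(L/ε). The sketch below (illustrative only; the `GradientOracle` wrapper, quadratic objective, and parameter choices are assumptions, not drawn from any specific paper) shows how oracle complexity is measured in practice: the gradient is wrapped in a counter, and an algorithm is charged one unit per query.

```python
import numpy as np


class GradientOracle:
    """Wraps a gradient function and counts how many times it is queried.

    Oracle complexity measures exactly this count: the number of first-order
    oracle (gradient) evaluations needed to reach a target accuracy,
    independent of other computational details.
    """

    def __init__(self, grad_fn):
        self.grad_fn = grad_fn
        self.num_queries = 0

    def __call__(self, x):
        self.num_queries += 1
        return self.grad_fn(x)


def gradient_descent(oracle, x0, step_size, tol=1e-6, max_queries=10_000):
    """Run plain gradient descent until the gradient norm drops below `tol`;
    return the final iterate and the number of oracle queries used."""
    x = x0
    while oracle.num_queries < max_queries:
        g = oracle(x)  # one first-order oracle query
        if np.linalg.norm(g) <= tol:
            break
        x = x - step_size * g
    return x, oracle.num_queries


if __name__ == "__main__":
    # Illustrative smooth convex objective: f(x) = 0.5 * x^T A x with A positive definite.
    A = np.diag([1.0, 10.0])  # smoothness constant L = 10
    oracle = GradientOracle(lambda x: A @ x)

    x_star, queries = gradient_descent(oracle, x0=np.ones(2), step_size=1.0 / 10.0)
    print(f"Reached ||grad|| <= 1e-6 after {queries} oracle queries")
```

The same counting device applies to stochastic or policy-gradient oracles; only the oracle being wrapped changes, not the accounting.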

Papers