Scaling Limit
The study of scaling limits in machine learning and quantum computing investigates how systems behave as key parameters (e.g., network width, depth, or number of qubits) approach infinity. Current research focuses on the dynamics of neural networks (including transformers and ResNets) and quantum algorithms (such as Quantum Signal Processing) in these limits, often employing tools from dynamical mean-field theory and random matrix theory to derive effective descriptions via differential equations. These studies aim to explain phenomena such as hyperparameter transferability and optimal algorithm scaling, ultimately improving model design, training efficiency, and the precision of quantum sensing.
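The infinite-width idea can be illustrated numerically. The sketch below (a minimal NumPy illustration, not code from any of the papers listed here) draws random two-layer networks in the standard 1/sqrt(fan-in) parameterization and shows that the output variance stabilizes as the width grows, consistent with the existence of a well-defined infinite-width limit:

```python
import numpy as np

def mlp_output(width, x, rng):
    # Two-layer MLP with i.i.d. Gaussian weights of variance 1/fan_in.
    # Under this scaling, preactivations stay O(1) as width -> infinity,
    # so the network output converges in distribution (the infinite-width,
    # NNGP-type limit).
    W1 = rng.normal(0.0, 1.0 / np.sqrt(len(x)), size=(width, len(x)))
    W2 = rng.normal(0.0, 1.0 / np.sqrt(width), size=(1, width))
    return float(W2 @ np.tanh(W1 @ x))

rng = np.random.default_rng(0)
x = np.ones(4)  # fixed input; hypothetical choice for illustration
for width in (64, 1024, 16384):
    outs = [mlp_output(width, x, rng) for _ in range(2000)]
    print(f"width={width:6d}  output variance ≈ {np.var(outs):.3f}")
```

For this input the preactivation variance is 1, so the output variance should approach E[tanh(z)^2] for z ~ N(0, 1), roughly 0.39, at every width; what the limit removes is the fluctuation around that value. Alternative parameterizations (e.g., mean-field or muP scalings) rescale these factors differently, which is what underlies hyperparameter-transfer results.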