Scaling Limit

Research on scaling limits in machine learning and quantum computing investigates the behavior of systems as key parameters (e.g., network width, depth, or number of qubits) approach infinity. Current work focuses on the dynamics of neural networks (including transformers and ResNets) and quantum algorithms (such as Quantum Signal Processing) in these limits, often employing tools from dynamical mean field theory and random matrix theory to derive effective descriptions via differential equations. These studies aim to explain phenomena like hyperparameter transferability and optimal algorithm scaling, ultimately improving model design, training efficiency, and the precision of quantum sensing.
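To make the width-to-infinity idea concrete, here is a minimal illustrative sketch (not drawn from any particular paper above): with the standard 1/sqrt(width) output scaling, the output variance of a randomly initialized one-hidden-layer network stabilizes to a finite constant as the width grows, which is the starting point for infinite-width effective descriptions. All function and variable names here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(10)  # fixed input of dimension d = 10

def output_variance(width, trials=2000):
    # f(x) = v . tanh(W x / sqrt(d)) / sqrt(width), with W, v ~ N(0, 1).
    # The 1/sqrt(width) factor keeps Var[f(x)] O(1) as width -> infinity.
    outs = []
    for _ in range(trials):
        W = rng.standard_normal((width, x.size))
        v = rng.standard_normal(width)
        h = np.tanh(W @ x / np.sqrt(x.size))
        outs.append(v @ h / np.sqrt(width))
    return np.var(outs)

# As width increases, the estimated variance settles near a constant
# (its infinite-width limit) instead of blowing up or vanishing.
for n in (16, 256, 4096):
    print(n, round(output_variance(n), 3))
```

Without the 1/sqrt(width) factor the variance would grow linearly in the width; this normalization is what makes a nontrivial limiting description possible.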

Papers