Infinite-Width Neural Networks

Infinite-width neural networks are a theoretical idealization for analyzing the behavior of very large neural networks by studying their properties as the number of neurons in each layer approaches infinity. Current research focuses on the relationship between network depth and sample complexity; the effect of different training algorithms and activation functions on learning dynamics, including the role of the neural tangent kernel (NTK); and the equivalence between infinite-width networks and other machine learning models such as Gaussian processes and support vector machines. These results offer insight into the generalization, robustness, and efficient training of practical, finite-width networks, informing both theoretical understanding and the design of improved algorithms and architectures.
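
As a concrete illustration of the Gaussian-process correspondence, the minimal sketch below compares the closed-form infinite-width (NNGP) kernel of a one-hidden-layer ReLU network against a Monte Carlo estimate from random weights. It assumes i.i.d. standard-normal weights and uses the order-1 arc-cosine kernel of Cho & Saul (2009); the function names (`relu_nngp_kernel`, `empirical_kernel`) and sample sizes are illustrative choices, not taken from any specific paper.

```python
import numpy as np

def relu_nngp_kernel(x1, x2):
    """Closed-form infinite-width (NNGP) kernel of a one-hidden-layer
    ReLU network with i.i.d. N(0, 1) weights: for w ~ N(0, I),
    E[relu(w @ x1) * relu(w @ x2)]
      = ||x1|| ||x2|| / (2 pi) * (sin(theta) + (pi - theta) cos(theta)),
    where theta is the angle between x1 and x2 (arc-cosine kernel)."""
    n1, n2 = np.linalg.norm(x1), np.linalg.norm(x2)
    cos_t = np.clip(x1 @ x2 / (n1 * n2), -1.0, 1.0)
    theta = np.arccos(cos_t)
    return n1 * n2 / (2 * np.pi) * (np.sin(theta) + (np.pi - theta) * np.cos(theta))

def empirical_kernel(x1, x2, n_weights=500_000, seed=0):
    """Monte Carlo estimate of E[relu(w @ x1) * relu(w @ x2)]:
    the hidden-layer covariance that a finite but very wide network
    (with readout weights scaled by 1/sqrt(width)) approximates."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n_weights, x1.shape[0]))  # one row per hidden unit
    return np.mean(np.maximum(W @ x1, 0.0) * np.maximum(W @ x2, 0.0))

x1 = np.array([1.0, 0.5, -0.3])
x2 = np.array([0.2, -1.0, 0.7])
print(f"closed form : {relu_nngp_kernel(x1, x2):.4f}")
print(f"Monte Carlo : {empirical_kernel(x1, x2):.4f}")
```

As the number of sampled weight vectors grows, the Monte Carlo estimate converges to the closed-form kernel; this is the sense in which the outputs of an infinitely wide, randomly initialized network form a Gaussian process with that covariance function.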

Papers