Two-Layer Neural Networks
Two-layer neural networks serve as a fundamental model for understanding the behavior of deeper networks, with research focusing on their optimization dynamics, generalization properties, and ability to learn features. Much of this work analyzes stochastic gradient descent and related algorithms, often through the neural tangent kernel approximation, to derive convergence rates and to quantify the effect of hyperparameters such as the learning rate and the network width. These studies strengthen the theoretical foundations of deep learning, informing the design of more efficient and robust training algorithms and clarifying phenomena such as spectral bias and the emergence of skills during training.
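To make the object of study concrete, the sketch below trains a width-m two-layer ReLU network, f(x) = sum_k a_k * relu(w_k . x + b_k), on a synthetic 1-D regression task with plain minibatch SGD. It is a minimal illustration under assumed choices, not the method of any particular paper: the width, learning rate, batch size, initialization scales, and the sin(3x) target are all picked for demonstration. The 1/sqrt(m) scaling of the output weights is one common convention under which, as the width grows, training stays close to the network's neural tangent kernel linearization, which is the regime many convergence analyses study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression target: y = sin(3x) on [-1, 1].
X = rng.uniform(-1.0, 1.0, size=(256, 1))
y = np.sin(3.0 * X[:, 0])

# Two-layer network f(x) = sum_k a_k * relu(w_k . x + b_k).
m = 128                                         # network width (illustrative)
W = rng.normal(0.0, 1.0, size=(m, 1))           # input-to-hidden weights
b = rng.normal(0.0, 1.0, size=m)                # hidden biases
a = rng.normal(0.0, 1.0 / np.sqrt(m), size=m)   # output weights, 1/sqrt(m) scale

def forward(x):
    """Return hidden pre-activations, activations, and scalar outputs."""
    z = x @ W.T + b            # (batch, m)
    h = np.maximum(z, 0.0)     # ReLU
    return z, h, h @ a         # outputs: (batch,)

lr, epochs, batch = 0.05, 500, 32
for epoch in range(epochs):
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch):
        i = idx[start:start + batch]
        xb, yb = X[i], y[i]
        z, h, pred = forward(xb)
        err = pred - yb                       # per-example error; loss is 0.5*MSE
        # Backpropagate through both layers (gradients averaged over the batch).
        grad_a = h.T @ err / len(i)
        dh = np.outer(err, a) * (z > 0)       # gradient at hidden pre-activations
        grad_W = dh.T @ xb / len(i)
        grad_b = dh.mean(axis=0)
        a -= lr * grad_a
        W -= lr * grad_W
        b -= lr * grad_b

_, _, pred = forward(X)
print("final MSE:", float(np.mean((pred - y) ** 2)))
```

Increasing m while reducing the learning rate is the usual way to probe the kernel regime empirically in a setup like this; spectral bias would appear here as the low-frequency components of the target being fit earlier in training than the high-frequency ones.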
Papers
(18 entries dated September 14, 2023 through April 29, 2024; paper titles and links were not preserved in this extract.)