Width Networks
Width networks, a term covering both infinite-width and large-but-finite-width neural networks, are a key object of study in efforts to understand the theoretical underpinnings of deep learning's success. Current investigations focus on the relationship between finite- and infinite-width models, explore architectures such as convolutional neural networks (CNNs) and fully connected networks, and analyze training dynamics with tools such as Neural Tangent Kernels (NTKs) and stochastic differential equations (SDEs). This research clarifies the role of feature learning, the impact of optimization algorithms, and how network width relates to generalization error and the capacity to learn complex functions, ultimately contributing to a more rigorous theoretical foundation for deep learning.
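Because the NTK is central to these width-based analyses, a small numerical sketch helps make the object concrete. The snippet below is a minimal illustration, not any particular paper's method: the architecture, widths, and helper names such as `empirical_ntk` are assumptions made for this example. It computes the empirical NTK `Theta(x1, x2) = J(x1) J(x2)^T`, where `J` is the Jacobian of the network output with respect to all parameters, for a toy fully connected ReLU network in JAX under the `1/sqrt(width)` "NTK parameterization", the regime in which the kernel becomes approximately constant during training as width grows.

```python
# Minimal sketch (assumes JAX is installed): empirical NTK of a small
# fully connected ReLU network in the NTK parameterization.
import jax
import jax.numpy as jnp

def init_params(key, widths=(3, 512, 1)):
    """Dense layers with weights ~ N(0, 1); the 1/sqrt(fan_in) scale
    is applied in the forward pass (NTK parameterization)."""
    params = []
    for d_in, d_out in zip(widths[:-1], widths[1:]):
        key, sub = jax.random.split(key)
        params.append((jax.random.normal(sub, (d_in, d_out)), jnp.zeros(d_out)))
    return params

def forward(params, x):
    for i, (w, b) in enumerate(params):
        x = x @ w / jnp.sqrt(w.shape[0]) + b  # 1/sqrt(width) scaling
        if i < len(params) - 1:               # ReLU on all hidden layers
            x = jax.nn.relu(x)
    return x.squeeze()                        # scalar output per input

def empirical_ntk(params, x1, x2):
    """Theta[i, j] = <df(x1_i)/dtheta, df(x2_j)/dtheta>,
    summed over every parameter tensor in the network."""
    j1 = jax.jacobian(forward)(params, x1)    # pytree: one Jacobian per parameter
    j2 = jax.jacobian(forward)(params, x2)
    flat = lambda j: jnp.concatenate(
        [leaf.reshape(leaf.shape[0], -1) for leaf in jax.tree_util.tree_leaves(j)],
        axis=1)
    return flat(j1) @ flat(j2).T

key = jax.random.PRNGKey(0)
params = init_params(key)
x = jax.random.normal(key, (4, 3))
print(empirical_ntk(params, x, x))            # 4x4 kernel matrix
```

Recomputing this kernel after a few gradient steps illustrates the finite- versus infinite-width contrast discussed above: at large hidden width the matrix barely moves during training (the "lazy" NTK regime), while at small width it drifts noticeably, which is one signature of feature learning.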