Neural Network Gaussian Process

Neural Network Gaussian Processes (NNGPs) provide a theoretical framework for understanding infinitely wide neural networks: as the width of every hidden layer tends to infinity, the distribution over outputs of a randomly initialized network converges to a Gaussian process, whose covariance (the NNGP kernel) can be computed layer by layer. Current research focuses on refining NNGP kernels, comparing them with related kernels like the Neural Tangent Kernel (NTK), and applying them to tasks such as Bayesian inference, multiple imputation of missing data, and uncertainty quantification in deep learning models. This work bridges the gap between the theoretical analysis of neural networks and their practical application, offering insights into generalization, representation learning, and efficient training strategies.
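As a concrete illustration, the sketch below computes the NNGP kernel of an infinitely wide, fully connected ReLU network via the standard layer-wise recursion, using the closed-form arc-cosine expectation of Cho & Saul (2009), and then uses it for exact GP regression, i.e. Bayesian inference in the infinite-width limit. The depth, the variance parameters `sigma_w` and `sigma_b`, and the toy data are illustrative choices, not taken from any particular paper.

```python
import numpy as np

def relu_expectation(k12, k11, k22):
    # Closed-form E[relu(u) * relu(v)] for (u, v) ~ N(0, [[k11, k12], [k12, k22]]);
    # this is the order-1 arc-cosine kernel of Cho & Saul (2009).
    norm = np.sqrt(k11 * k22)
    theta = np.arccos(np.clip(k12 / norm, -1.0, 1.0))
    return norm / (2.0 * np.pi) * (np.sin(theta) + (np.pi - theta) * np.cos(theta))

def nngp_kernel(x1, x2, depth=3, sigma_w=1.0, sigma_b=0.1):
    # NNGP kernel matrix K(x1, x2) for an infinitely wide, fully connected
    # ReLU network with `depth` hidden layers, N(0, sigma_w^2 / fan_in)
    # weights, and N(0, sigma_b^2) biases (illustrative hyperparameters).
    d = x1.shape[1]
    # Layer-0 pre-activation covariances from the input Gram matrices.
    k12 = sigma_b**2 + sigma_w**2 * (x1 @ x2.T) / d        # (n1, n2) cross terms
    k11 = sigma_b**2 + sigma_w**2 * np.sum(x1**2, 1) / d   # (n1,) diagonal
    k22 = sigma_b**2 + sigma_w**2 * np.sum(x2**2, 1) / d   # (n2,) diagonal
    for _ in range(depth):
        k12 = sigma_b**2 + sigma_w**2 * relu_expectation(
            k12, k11[:, None], k22[None, :])
        # On the diagonal, E[relu(u)^2] = k / 2 when u ~ N(0, k).
        k11 = sigma_b**2 + sigma_w**2 * k11 / 2.0
        k22 = sigma_b**2 + sigma_w**2 * k22 / 2.0
    return k12

# Exact GP regression with the NNGP kernel on toy data: the posterior mean
# equals the prediction of the corresponding infinite-width Bayesian network.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(20, 5)), rng.normal(size=(20, 1))
X_test = rng.normal(size=(4, 5))
K = nngp_kernel(X, X) + 1e-3 * np.eye(len(X))  # observation-noise jitter
posterior_mean = nngp_kernel(X_test, X) @ np.linalg.solve(K, y)
```

The same recursion generalizes to other nonlinearities and architectures; libraries such as Google's neural-tangents package automate this construction (and the related NTK computation) for general network topologies.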

Papers