Neural Network Gaussian Process
Neural Network Gaussian Processes (NNGPs) provide a theoretical framework for understanding the behavior of neural networks by showing that, in the infinite-width limit, a network's outputs converge to a Gaussian process. Current research focuses on refining NNGP kernels, comparing them to related kernels such as the Neural Tangent Kernel (NTK), and leveraging them for tasks such as Bayesian inference, multiple imputation of missing data, and improved uncertainty quantification in deep learning models. This work bridges the gap between the theoretical analysis of neural networks and their practical application, offering insights into generalization, representation learning, and efficient training strategies.
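As a concrete illustration of the infinite-width correspondence, the NNGP kernel of a fully connected ReLU network can be computed by a layer-wise recursion: each layer maps the previous layer's covariance through a Gaussian expectation of the activation, which for ReLU has the closed-form arc-cosine expression of Cho & Saul. The sketch below is a minimal NumPy implementation under assumed, illustrative hyperparameters (`depth`, `sigma_w2`, `sigma_b2`); it is not taken from any particular paper's code.

```python
import numpy as np

def nngp_kernel(X, depth=3, sigma_w2=2.0, sigma_b2=0.0):
    """NNGP kernel of an infinitely wide fully connected ReLU network.

    X        : (n, d) array of inputs.
    depth    : number of hidden layers (illustrative choice).
    sigma_w2 : weight variance; 2.0 keeps the ReLU kernel's scale stable.
    sigma_b2 : bias variance.
    """
    # Input-layer covariance: K^0(x, x') = sigma_w2 * <x, x'> / d + sigma_b2.
    K = sigma_w2 * (X @ X.T) / X.shape[1] + sigma_b2
    for _ in range(depth):
        diag = np.sqrt(np.diag(K))
        norm = np.outer(diag, diag)
        # Guard against rounding slightly outside [-1, 1] before arccos.
        theta = np.arccos(np.clip(K / norm, -1.0, 1.0))
        # Closed-form Gaussian expectation E[ReLU(u) ReLU(v)]
        # (arc-cosine kernel), applied layer by layer.
        K = sigma_b2 + (sigma_w2 / (2 * np.pi)) * norm * (
            np.sin(theta) + (np.pi - theta) * np.cos(theta)
        )
    return K

X = np.random.default_rng(0).normal(size=(5, 3))
K = nngp_kernel(X)
```

The resulting matrix `K` is a valid GP covariance over the inputs and can be plugged directly into standard GP regression formulas for exact Bayesian inference with the corresponding infinite-width network.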