Infinite Depth
Research on "infinite depth" in neural networks explores the theoretical and practical implications of networks with an unbounded number of layers, aiming to understand their representational power and computational efficiency. Current work focuses on developing and analyzing novel architectures like infinitely deep Bayesian networks and equilibrium models, along with investigating the properties of their neural tangent kernels. These studies contribute to a deeper understanding of neural network behavior in the infinite-depth limit, potentially leading to more efficient and robust models for various machine learning tasks.