Infinite Depth

Research on "infinite depth" in neural networks explores the theoretical and practical implications of networks with an unbounded number of layers, aiming to characterize their representational power and computational efficiency. Current work focuses on developing and analyzing architectures such as infinitely deep Bayesian networks and deep equilibrium models, and on studying the properties of their neural tangent kernels. These studies deepen our understanding of neural network behavior in the infinite-depth limit and may lead to more efficient and robust models for a range of machine learning tasks.
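As a concrete illustration of the infinite-depth limit, a deep equilibrium model replaces an unbounded stack of identical (weight-tied) layers with the fixed point those layers would converge to. The sketch below is a minimal, hypothetical example (the layer form, names, and dimensions are illustrative, not taken from any specific paper): it solves z* = tanh(W z* + U x + b) by simple fixed-point iteration, with W scaled to be a contraction so convergence is guaranteed.

```python
import numpy as np

def deq_layer(x, W, U, b, tol=1e-6, max_iter=500):
    """Find the fixed point z* = tanh(W @ z* + U @ x + b),
    i.e. the infinite-depth limit of a weight-tied network."""
    z = np.zeros_like(b)
    for _ in range(max_iter):
        z_next = np.tanh(W @ z + U @ x + b)
        if np.linalg.norm(z_next - z) < tol:
            return z_next
        z = z_next
    return z

rng = np.random.default_rng(0)
d = 8
W = rng.standard_normal((d, d))
W *= 0.5 / np.linalg.norm(W, 2)  # spectral norm 0.5 -> contraction
U = rng.standard_normal((d, d))
b = rng.standard_normal(d)
x = rng.standard_normal(d)

z_star = deq_layer(x, W, U, b)
residual = np.linalg.norm(z_star - np.tanh(W @ z_star + U @ x + b))
```

In practice, deep equilibrium models use faster root-finders than plain iteration and backpropagate through the fixed point via the implicit function theorem, but the iteration above captures the core idea: depth goes to infinity while memory stays constant.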

Papers