Gaussian Neural Network
Gaussian neural networks (GNNs) place Gaussian distributions on network weights, which simplifies theoretical analysis and supports practical applications. Current research focuses on two directions: improving efficiency through compression techniques such as context modeling for 3D Gaussian splatting, and studying how activation functions (e.g., ReLU) shape network behavior and robustness to adversarial attacks, including novel neuron architectures such as Finite Gaussian Neurons. These advances deepen our understanding of GNN properties, such as their asymptotic behavior and convergence, and enable more robust and efficient models for tasks ranging from image restoration to Bayesian optimization.
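The basic object studied here can be sketched in a few lines: a shallow network whose weights are drawn i.i.d. from Gaussian distributions, with 1/sqrt(fan-in) scaling so the output variance stays bounded as the width grows. This is a minimal illustration of the infinite-width regime analyzed in the papers below, not code from any of them; the function name and parameters are chosen for this example.

```python
import numpy as np

def gaussian_nn(x, width, rng, activation=np.tanh):
    """One-hidden-layer network with i.i.d. Gaussian weights.

    Weights are scaled by 1/sqrt(fan-in) so the output variance
    stays bounded as the hidden width grows.
    """
    d = x.shape[0]
    W1 = rng.normal(0.0, 1.0 / np.sqrt(d), size=(width, d))
    b1 = rng.normal(0.0, 1.0, size=width)
    W2 = rng.normal(0.0, 1.0 / np.sqrt(width), size=width)
    return W2 @ activation(W1 @ x + b1)

rng = np.random.default_rng(0)
x = np.ones(3)
# Sample many independent networks at a fixed input; as the width
# grows, the output distribution approaches a Gaussian (the
# central-limit behavior that the non-asymptotic bounds quantify).
samples = np.array([gaussian_nn(x, 1000, rng) for _ in range(500)])
print(samples.mean(), samples.std())
```

The second-order Poincaré approach in the first paper below gives explicit rates for how far this finite-width output distribution is from its Gaussian limit.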
Papers
Non-asymptotic approximations of Gaussian neural networks via second-order Poincaré inequalities
Alberto Bordino, Stefano Favaro, Sandra Fortini
Infinitely wide limits for deep Stable neural networks: sub-linear, linear and super-linear activation functions
Alberto Bordino, Stefano Favaro, Sandra Fortini