Lipschitz Network

Lipschitz networks are neural networks designed to have a bounded Lipschitz constant L, which guarantees that any input perturbation moves the output by at most L times as much (||f(x) − f(y)|| ≤ L||x − y||), thereby enhancing robustness against adversarial attacks and noise. Current research focuses on architectures and training methods that enforce this constraint without sacrificing expressive power, drawing on techniques such as optimal transport, spectral normalization, constrained layer parameterizations like the "sandwich layer," and novel activation functions such as the N-activation. This work is significant because it addresses the critical need for reliable and trustworthy deep learning models in safety-critical applications and sharpens the theoretical understanding of function approximation within constrained network architectures.

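As a minimal sketch of one of these techniques, spectral normalization: constraining every linear layer's spectral norm to at most 1 makes each layer 1-Lipschitz, and composing such layers with 1-Lipschitz activations (e.g. ReLU) bounds the whole network's Lipschitz constant by 1. The example below uses PyTorch's built-in spectral_norm parametrization; the helper name lipschitz_mlp, the layer sizes, and the empirical check are illustrative assumptions, not the construction of any particular paper.

```python
# Hedged sketch: a (approximately) 1-Lipschitz MLP via spectral normalization.
# lipschitz_mlp and the dimensions below are illustrative, not from any paper.
import torch
import torch.nn as nn
from torch.nn.utils.parametrizations import spectral_norm

def lipschitz_mlp(dims):
    """MLP with L2 Lipschitz constant <= 1: each linear layer's weight is
    divided by its largest singular value, and ReLU is itself 1-Lipschitz."""
    layers = []
    for d_in, d_out in zip(dims[:-1], dims[1:]):
        layers.append(spectral_norm(nn.Linear(d_in, d_out)))
        layers.append(nn.ReLU())
    return nn.Sequential(*layers[:-1])  # no activation after the last layer

net = lipschitz_mlp([16, 64, 64, 4])

with torch.no_grad():
    # Warm-up passes let the power-iteration estimate of each layer's
    # spectral norm converge (it updates once per forward in training mode).
    for _ in range(20):
        net(torch.randn(8, 16))
    net.eval()

    # Empirical check of the Lipschitz condition ||f(x)-f(y)|| <= ||x-y||.
    x, y = torch.randn(1000, 16), torch.randn(1000, 16)
    ratio = ((net(x) - net(y)).norm(dim=1) / (x - y).norm(dim=1)).max()
    print(f"max ||f(x)-f(y)|| / ||x-y||: {ratio.item():.3f}")  # <= ~1.0
```

Because the spectral norm is estimated by power iteration rather than computed exactly, the bound holds only approximately; constructions studied in this literature (e.g. orthogonality-constrained layers) trade extra computation for a certified constant.
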
Papers