Paper ID: 2201.00632

Neural network training under semidefinite constraints

Patricia Pauli, Niklas Funcke, Dennis Gramlich, Mohamed Amine Msalmi, Frank Allgöwer

This paper is concerned with the training of neural networks (NNs) under semidefinite constraints, which allows for NN training with robustness and stability guarantees. In particular, we focus on Lipschitz bounds for NNs. Exploiting the banded structure of the underlying matrix constraint, we set up an efficient and scalable training scheme based on interior point methods for NN training problems of this kind. Our implementation allows us to enforce Lipschitz constraints via semidefinite constraints in the training of large-scale deep NNs such as Wasserstein generative adversarial networks (WGANs). In numerical examples, we show the superiority of our method and its applicability to WGAN training.
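As an illustrative sketch (not the paper's implementation), the following cvxpy snippet shows the kind of LipSDP-style semidefinite Lipschitz constraint referred to above, for a one-hidden-layer network with activations slope-restricted in [0, 1]; the weights, dimensions, and solver choice are hypothetical.

import numpy as np
import cvxpy as cp

# Hypothetical weights of a one-hidden-layer network f(x) = W2 @ sigma(W1 @ x + b1) + b2,
# where sigma is slope-restricted in [0, 1] (e.g. ReLU).
rng = np.random.default_rng(0)
W1 = rng.standard_normal((16, 4))   # hidden layer weights (n1 x n0)
W2 = rng.standard_normal((2, 16))   # output layer weights (n2 x n1)
n0, n1 = W1.shape[1], W1.shape[0]

rho = cp.Variable(nonneg=True)       # rho = L^2, squared Lipschitz bound
lam = cp.Variable(n1, nonneg=True)   # diagonal multipliers for the activation
T = cp.diag(lam)

# LipSDP-style semidefinite constraint: if the block matrix
#   [ -rho*I       W1^T T      ]
#   [  T W1   W2^T W2 - 2 T ]
# is negative semidefinite, then sqrt(rho) is a Lipschitz bound of the network.
M = cp.bmat([
    [-rho * np.eye(n0), W1.T @ T],
    [T @ W1, W2.T @ W2 - 2 * T],
])

# Minimize rho subject to M <= 0 (symmetrized for cvxpy's PSD constraint).
prob = cp.Problem(cp.Minimize(rho), [(M + M.T) / 2 << 0])
prob.solve(solver=cp.SCS)
print("certified Lipschitz bound:", np.sqrt(rho.value))

For deeper networks this constraint becomes block tridiagonal, which is the banded structure exploited in the paper for scalability; a general-purpose SDP solver as used in this sketch would not scale to large deep NNs.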

Submitted: Jan 3, 2022