Paper ID: 2305.06344

Orthogonal Transforms in Neural Networks Amount to Effective Regularization

Krzysztof Zając, Wojciech Sopot, Paweł Wachel

We consider applications of neural networks in nonlinear system identification and formulate the hypothesis that adjusting a general network structure by incorporating frequency information, or another known orthogonal transform, should result in an efficient neural network that retains its universal properties. We show that such a structure is a universal approximator and that using any orthogonal transform in the proposed way implies regularization during training, achieved by adjusting the learning rate of each parameter individually. In particular, we show empirically that such a structure, using the Fourier transform, outperforms equivalent models without orthogonality support.

Submitted: May 10, 2023
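
Below is a minimal PyTorch sketch of one possible reading of the proposed structure, not the authors' exact architecture: a standard dense layer preceded by a fixed orthogonal transform of its input. An orthonormal DCT-II stands in here for the "known orthogonal transform" (the paper's experiments use the Fourier transform), and the layer name OrthogonalTransformLinear and the exact placement of the transform are illustrative assumptions.

    # Sketch: a dense layer acting on a fixed orthogonal transform of its input.
    # Assumed structure for illustration only; see the paper for the actual model.
    import math
    import torch
    import torch.nn as nn


    def orthonormal_dct_matrix(n: int) -> torch.Tensor:
        """Orthonormal DCT-II matrix C satisfying C @ C.T == I."""
        k = torch.arange(n).unsqueeze(1).float()            # frequency index (rows)
        i = torch.arange(n).unsqueeze(0).float()            # sample index (columns)
        c = torch.cos(math.pi * (2 * i + 1) * k / (2 * n))
        c = c * math.sqrt(2.0 / n)
        c[0, :] *= 1.0 / math.sqrt(2.0)                     # rescale DC row for orthonormality
        return c


    class OrthogonalTransformLinear(nn.Module):
        """Computes y = W (Q x) + b, where Q is a fixed orthogonal matrix.

        Q is registered as a buffer, so only W and b are trained; the paper argues
        that inserting such an orthogonal transform acts as regularization by
        effectively adjusting the learning rate of each parameter individually.
        """

        def __init__(self, in_features: int, out_features: int):
            super().__init__()
            self.register_buffer("Q", orthonormal_dct_matrix(in_features))  # fixed, not trained
            self.linear = nn.Linear(in_features, out_features)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.linear(x @ self.Q.T)                # transform input, then learnable map


    if __name__ == "__main__":
        layer = OrthogonalTransformLinear(in_features=64, out_features=16)
        x = torch.randn(8, 64)
        print(layer(x).shape)                               # torch.Size([8, 16])

Swapping orthonormal_dct_matrix for any other orthogonal matrix (e.g., a real orthogonal realization of the Fourier transform) leaves the layer's input-output dimensions and training procedure unchanged, which is what makes the comparison against an equivalent plain linear layer straightforward.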