Paper ID: 2112.15545

Training and Generating Neural Networks in Compressed Weight Space

Kazuki Irie, Jürgen Schmidhuber

The inputs and/or outputs of some neural nets are weight matrices of other neural nets. Indirect encodings or end-to-end compression of weight matrices could help to scale such approaches. Our goal is to open a discussion on this topic, starting with recurrent neural networks for character-level language modelling whose weight matrices are encoded by the discrete cosine transform. Our fast weight version thereof uses a recurrent neural network to parameterise the compressed weights. We present experimental results on the enwik8 dataset.

Submitted: Dec 31, 2021
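
To make the core idea of DCT-encoded weight matrices concrete, here is a minimal sketch (not the authors' implementation) of how a small block of learnable DCT coefficients can be decoded into a full weight matrix. The function name `decode_weight`, the coefficient block size, and the target shape are illustrative assumptions; only the use of an inverse 2-D discrete cosine transform follows the abstract.

```python
import numpy as np
from scipy.fft import idctn  # inverse N-dimensional DCT (SciPy >= 1.4)

def decode_weight(coeffs: np.ndarray, out_shape: tuple) -> np.ndarray:
    """Reconstruct a dense weight matrix from a small block of DCT coefficients.

    `coeffs` holds the low-frequency coefficients (the compressed, learnable
    parameters); all higher-frequency coefficients are assumed to be zero.
    """
    padded = np.zeros(out_shape, dtype=coeffs.dtype)
    padded[:coeffs.shape[0], :coeffs.shape[1]] = coeffs
    return idctn(padded, norm="ortho")  # inverse 2-D DCT -> full weight matrix

# Hypothetical usage: 32x32 coefficients decode to a 256x1024 weight matrix,
# so only 1024 parameters stand in for 262144 weights.
rng = np.random.default_rng(0)
compressed = 0.02 * rng.standard_normal((32, 32))
W = decode_weight(compressed, (256, 1024))
print(W.shape)  # (256, 1024)
```

In the fast weight variant described in the abstract, such compressed coefficients would not be static parameters but would instead be produced step by step by a recurrent network; the sketch above only illustrates the decoding direction.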