Paper ID: 2306.15188

One-class systems seamlessly fit in the forward-forward algorithm

Michael Hopwood

The forward-forward algorithm presents a new method of training neural networks by updating weights during inference, performing parameter updates for each layer individually. This immediately reduces memory requirements during training and may lead to further benefits, such as seamless online training. The method relies on a loss ("goodness") function that can be evaluated on the activations of each layer, which can vary in size depending on the hyperparameterization of the network. The seminal paper proposed a goodness function to fill this need; however, in a one-class problem context, there is no need to devise a new loss, because one-class objective functions innately handle layers of varying size. In this paper, we investigate the performance of deep one-class objective functions when trained in a forward-forward fashion. The code is available at \url{https://github.com/MichaelHopwood/ForwardForwardOneclass}.
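To make the idea concrete, the sketch below (not the authors' code; see the linked repository for their implementation) shows one plausible way a one-class objective can serve as a per-layer goodness function in forward-forward training. Each layer is optimized locally with a Deep-SVDD-style loss that pulls activations of in-class data toward a per-layer center; the layer sizes, learning rate, and center-initialization heuristic are illustrative assumptions.

# Minimal sketch, assuming a Deep-SVDD-style one-class goodness:
# each layer is trained with its own local loss, and activations are
# detached before being passed on, so no gradients cross layer boundaries.
import torch
import torch.nn as nn

class FFOneClassLayer(nn.Module):
    def __init__(self, in_dim, out_dim, lr=1e-3):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.act = nn.ReLU()
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)
        # Per-layer center for the one-class objective (size follows the
        # layer width, so the loss adapts to any hyperparameterization).
        self.register_buffer("center", torch.zeros(out_dim))
        self.center_set = False

    def forward(self, x):
        return self.act(self.linear(x))

    def train_step(self, x):
        h = self.forward(x)
        if not self.center_set:
            # Initialize the center as the mean activation of the first
            # batch, a common Deep-SVDD heuristic (an assumption here).
            self.center.copy_(h.mean(dim=0).detach())
            self.center_set = True
        # One-class goodness: mean squared distance to the center.
        loss = ((h - self.center) ** 2).sum(dim=1).mean()
        self.opt.zero_grad()
        loss.backward()
        self.opt.step()
        # Detach so the next layer trains on fixed inputs.
        return h.detach(), loss.item()

# Usage: train a stack of layers greedily, one local update per batch.
layers = [FFOneClassLayer(784, 256), FFOneClassLayer(256, 128)]
x = torch.randn(64, 784)  # stand-in for a batch of in-class samples
for layer in layers:
    x, loss = layer.train_step(x)

Because the distance-to-center loss is defined for activations of any dimensionality, it plugs into each layer without modification, which is the sense in which one-class objectives "seamlessly fit" the forward-forward setting.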

Submitted: Jun 27, 2023