Paper ID: 2308.00720
Divergence of the ADAM algorithm with fixed-stepsize: a (very) simple example
Ph. L. Toint
A very simple unidimensional function with Lipschitz continuous gradient is constructed such that the ADAM algorithm with constant stepsize, started from the origin, diverges when applied to minimize this function, even in the absence of noise on the gradient. Divergence occurs irrespective of the choice of the method's parameters.
Submitted: Aug 1, 2023
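
For context, here is a minimal sketch of the fixed-stepsize ADAM iteration the abstract refers to, following the standard update of Kingma and Ba with the stepsize alpha held constant across iterations. The objective used below is a generic placeholder, not the counterexample constructed in the paper.

```python
import math

def adam_fixed_step(grad, x0=0.0, alpha=0.1, beta1=0.9, beta2=0.999,
                    eps=1e-8, iters=100):
    """Fixed-stepsize ADAM on a 1-D function, given its gradient `grad`."""
    x, m, v = x0, 0.0, 0.0
    for k in range(1, iters + 1):
        g = grad(x)                          # exact (noiseless) gradient
        m = beta1 * m + (1 - beta1) * g      # first-moment estimate
        v = beta2 * v + (1 - beta2) * g * g  # second-moment estimate
        m_hat = m / (1 - beta1 ** k)         # bias corrections
        v_hat = v / (1 - beta2 ** k)
        x -= alpha * m_hat / (math.sqrt(v_hat) + eps)  # constant stepsize
    return x

# Placeholder objective (NOT the paper's counterexample): f(x) = x**2 / 2,
# whose gradient x is Lipschitz continuous with constant 1.
print(adam_fixed_step(lambda x: x, x0=0.0))
```

The paper's contribution is the construction of a specific unidimensional objective on which this iteration, started from x0 = 0, diverges for every choice of alpha, beta1, beta2, and eps; that function is not reproduced here.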