Paper ID: 2206.09054
Learning the parameters of a differential equation from its trajectory via the adjoint equation
Imre Fekete, András Molnár, Péter L. Simon
The paper contributes to strengthening the relation between machine learning and the theory of differential equations. In this context, the inverse problem of fitting the parameters and the initial condition of a differential equation to measurements constitutes a key issue. The paper explores an abstraction that can be used to construct a family of loss functions with the aim of fitting the solution of an initial value problem to a set of discrete or continuous measurements. It is shown that an extension of the adjoint equation can be used to derive the gradient of the loss function as a continuous analogue of backpropagation in machine learning. Numerical evidence is presented that, under reasonably controlled circumstances, the gradients obtained this way can be used in gradient descent to fit the solution of an initial value problem to a set of continuous noisy measurements and to a set of discrete noisy measurements recorded at uncertain times.
Submitted: Jun 17, 2022
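
To illustrate the adjoint-based gradient and the gradient-descent fitting described in the abstract, the following is a minimal sketch, not the authors' code. It assumes a simple scalar test problem x'(t) = -theta * x(t), x(0) = x0, a continuous least-squares loss J = int_0^T (x(t) - y(t))^2 dt against noisy measurements y(t), and an explicit Euler discretization; the adjoint variable lam solves, backward in time, lam'(t) = theta * lam(t) - 2 * (x(t) - y(t)) with lam(T) = 0, giving dJ/dtheta = int_0^T lam * (-x) dt and dJ/dx0 = lam(0). All variable names and problem choices are illustrative, not taken from the paper.

import numpy as np

T, n = 5.0, 2000
t = np.linspace(0.0, T, n + 1)
dt = t[1] - t[0]

rng = np.random.default_rng(0)
theta_true, x0_true = 0.8, 2.0
# Synthetic "continuous" measurements: exact solution plus noise on a fine grid.
y = x0_true * np.exp(-theta_true * t) + 0.02 * rng.standard_normal(t.size)

def forward(theta, x0):
    """Explicit Euler solve of x' = -theta * x on the grid."""
    x = np.empty_like(t)
    x[0] = x0
    for k in range(n):
        x[k + 1] = x[k] + dt * (-theta * x[k])
    return x

def loss_and_grad(theta, x0):
    x = forward(theta, x0)
    r = x - y                              # residual against the measurements
    J = dt * np.sum(r ** 2)                # continuous-in-time least-squares loss
    # Adjoint sweep, backward in time: lam' = theta * lam - 2 * r, lam(T) = 0.
    lam = np.empty_like(t)
    lam[-1] = 0.0
    for k in range(n, 0, -1):
        lam[k - 1] = lam[k] - dt * (theta * lam[k] - 2.0 * r[k])
    dJ_dtheta = dt * np.sum(lam * (-x))    # integral of lam * (df/dtheta)
    dJ_dx0 = lam[0]                        # sensitivity w.r.t. the initial condition
    return J, dJ_dtheta, dJ_dx0

theta, x0, lr = 0.3, 1.0, 0.05             # initial guesses and step size
for _ in range(1000):
    J, g_theta, g_x0 = loss_and_grad(theta, x0)
    theta -= lr * g_theta
    x0 -= lr * g_x0
print(f"estimated theta = {theta:.3f}, x0 = {x0:.3f} (true values: 0.8, 2.0)")

Note that a single backward adjoint solve yields the gradient with respect to both the parameter and the initial condition, which is the sense in which the adjoint equation acts as a continuous analogue of backpropagation; the paper's formulation covers more general loss functions, including discrete measurements at uncertain times.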