Paper ID: 2410.11648 • Published Oct 15, 2024
Efficient, Accurate and Stable Gradients for Neural ODEs
Sam McCallum, James Foster
Training Neural ODEs requires backpropagating through an ODE solve. The
state-of-the-art backpropagation method is recursive checkpointing, which
balances recomputation against memory cost. Here, we introduce a class of
algebraically reversible ODE solvers that significantly improve upon both the
time and memory cost of recursive checkpointing. The reversible solvers
presented calculate exact gradients, are high-order and numerically stable --
strictly improving on previous reversible architectures.
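To illustrate the idea of algebraic reversibility (not the paper's specific schemes), a minimal sketch follows: the classic explicit leapfrog step y_{k+1} = y_{k-1} + 2h f(y_k) can be inverted in closed form, y_{k-1} = y_{k+1} - 2h f(y_k), so the backward pass can reconstruct intermediate states from the final pair alone instead of checkpointing them. The vector field `f`, the step count, and the harmonic-oscillator example are illustrative assumptions; in floating point the reconstruction here agrees only up to rounding error, whereas purpose-built reversible solvers are constructed to avoid even that.

```python
import numpy as np

def f(y):
    # Illustrative vector field: simple harmonic oscillator dy/dt = A y.
    A = np.array([[0.0, 1.0], [-1.0, 0.0]])
    return A @ y

def forward(y0, h, n_steps):
    # Leapfrog: (y_k, y_{k+1}) -> (y_{k+1}, y_k + 2h f(y_{k+1})),
    # bootstrapped with a single Euler step. Only the final pair is kept;
    # no intermediate states are stored.
    a, b = y0, y0 + h * f(y0)
    for _ in range(n_steps):
        a, b = b, a + 2 * h * f(b)
    return a, b

def backward(a, b, h, n_steps):
    # Algebraic inverse of the forward step:
    # (y_k, y_{k+1}) -> (y_{k+1} - 2h f(y_k), y_k) = (y_{k-1}, y_k).
    for _ in range(n_steps):
        a, b = b - 2 * h * f(a), a
    return a, b

y0 = np.array([1.0, 0.0])
h, n = 0.01, 100
a, b = forward(y0, h, n)
ra, rb = backward(a, b, h, n)
# ra recovers y0 up to floating-point rounding; a gradient pass can
# therefore revisit every state without ever having stored it.
```

A reverse-mode gradient computation would interleave these inverse steps with vector-Jacobian products, giving O(1) memory in the number of solver steps rather than the checkpoint-dependent cost of recursive checkpointing.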