Paper ID: 2403.08791

Liquid Resistance Liquid Capacitance Networks

Mónika Farsang, Sophie A. Neubauer, Radu Grosu

We introduce liquid-resistance liquid-capacitance neural networks (LRCs), a neural-ODE model which considerably improves the generalization, accuracy, and biological plausibility of electrical equivalent circuits (EECs), liquid time-constant networks (LTCs), and saturated liquid time-constant networks (STCs), respectively. We also introduce LRC units (LRCUs), a very efficient and accurate gated RNN model, which results from solving LRCs with an explicit Euler scheme using just one unfolding. We empirically show and formally prove that the liquid capacitance of LRCs considerably dampens the oscillations of LTCs and STCs, while at the same time dramatically increasing accuracy even for cheap solvers. We experimentally demonstrate that LRCs are a highly competitive alternative to popular neural ODEs and gated RNNs in terms of accuracy, efficiency, and interpretability, on classic time-series benchmarks and a complex autonomous-driving lane-keeping task.
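To make the LRCU construction concrete, the following is a minimal sketch of a single-step (one-unfolding) explicit-Euler cell in the spirit of the abstract. The abstract does not give the LRC equations, so the specific gating functions, the RC-style dynamics C(x,u) dx/dt = g(x,u) (E - x), and all names below (LRCUCellSketch, W_g, W_c, E) are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class LRCUCellSketch:
    """Hypothetical gated-RNN cell obtained by one explicit-Euler step of an
    RC-circuit-style ODE with state/input-dependent ("liquid") resistance and
    capacitance. Placeholder for illustration only."""

    def __init__(self, input_dim, hidden_dim, dt=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.dt = dt
        # Parameters of the state- and input-dependent gates (assumed form).
        self.W_g = rng.normal(scale=0.1, size=(hidden_dim, input_dim + hidden_dim))
        self.b_g = np.zeros(hidden_dim)
        self.W_c = rng.normal(scale=0.1, size=(hidden_dim, input_dim + hidden_dim))
        self.b_c = np.zeros(hidden_dim)
        # Reversal-potential-like target state the dynamics are pulled toward.
        self.E = rng.normal(scale=0.1, size=hidden_dim)

    def step(self, x, u):
        """One explicit-Euler step of C(x,u) dx/dt = g(x,u) * (E - x)."""
        z = np.concatenate([u, x])
        g = sigmoid(self.W_g @ z + self.b_g)        # liquid conductance (1/resistance)
        c = 1.0 + sigmoid(self.W_c @ z + self.b_c)  # liquid capacitance, >= 1, damps the update
        dxdt = g * (self.E - x) / c
        return x + self.dt * dxdt                   # one unfolding => gated-RNN-style update

# Usage: unroll the single-step cell over a toy input sequence.
cell = LRCUCellSketch(input_dim=3, hidden_dim=8)
x = np.zeros(8)
for u in np.random.default_rng(1).normal(size=(20, 3)):
    x = cell.step(x, u)
```

The single Euler step is what turns the continuous-time LRC into a discrete gated recurrence; the capacitance-like divisor illustrates how a state-dependent damping term can suppress the oscillations the abstract attributes to LTCs and STCs.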

Submitted: Jan 30, 2024