Paper ID: 2303.02189

Interpretable reduced-order modeling with time-scale separation

Sebastian Kaltenbach, Phaedon-Stelios Koutsourelakis, Petros Koumoutsakos

High-dimensional Partial Differential Equations (PDEs) are commonly encountered in computational physics and engineering. However, finding solutions for these PDEs can be computationally expensive, making model-order reduction crucial. We propose a data-driven model-order reduction scheme that automates the identification of the time scales involved and can produce stable predictions forward in time as well as under different initial conditions not included in the training data. To this end, we combine a non-linear autoencoder architecture with a time-continuous model for the latent dynamics in complex space, which readily allows for the inclusion of sparse and irregularly sampled training data. The learned latent dynamics are interpretable and reveal the different temporal scales involved. We show that this data-driven scheme can automatically learn the independent processes that decompose a system of linear ODEs along the eigenvectors of the system's matrix. We further demonstrate the applicability of the proposed framework to a hidden Markov model and the (discretized) Kuramoto-Sivashinsky (KS) equation. Additionally, we propose a probabilistic version, which captures predictive uncertainties and further improves upon the results of the deterministic framework.
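To illustrate the kind of model the abstract describes, the following is a minimal sketch (not the authors' implementation) of an autoencoder coupled to a time-continuous latent dynamics with complex eigenvalues: the real part of each eigenvalue sets a decay time scale and the imaginary part an oscillation frequency, and the closed-form latent solution can be queried at arbitrary, irregularly spaced times. All dimensions, names, and the use of linear encoder/decoder maps here are illustrative assumptions; the paper uses a non-linear autoencoder trained jointly with the latent dynamics.

```python
import numpy as np

# Hypothetical dimensions: high-dimensional state x, low-dimensional complex latent z.
dim_x, dim_z = 64, 4
rng = np.random.default_rng(0)

# Placeholder linear encoder/decoder maps (the paper uses a non-linear autoencoder).
W_enc = rng.standard_normal((dim_z, dim_x)) / np.sqrt(dim_x)
W_dec = rng.standard_normal((dim_x, dim_z)) / np.sqrt(dim_z)

# Complex eigenvalues of the uncoupled latent ODE dz/dt = lambda * z.
# Re(lambda) < 0 gives the decay time scale, Im(lambda) the oscillation frequency.
lam = np.array([-0.1 + 2.0j, -0.5 + 0.5j, -1.0 + 0.0j, -2.0 + 5.0j])

def predict(x0, t):
    """Encode the initial state, evolve the latent ODE in closed form, decode."""
    z0 = W_enc @ x0                 # encode to (complex) latent coordinates
    z_t = z0 * np.exp(lam * t)      # exact solution of the linear latent ODE at time t
    return np.real(W_dec @ z_t)     # decode back to the full state

x0 = rng.standard_normal(dim_x)
for t in (0.0, 0.5, 2.0):           # continuous time: arbitrary, irregular query times
    print(f"t = {t:.1f}, |x(t)| = {np.linalg.norm(predict(x0, t)):.3f}")
```

Because the latent dynamics are solved in closed form, predictions remain stable for long horizons as long as the learned eigenvalues have non-positive real parts, which is consistent with the interpretability and stability claims in the abstract.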

Submitted: Mar 3, 2023