Paper ID: 2408.09340
Improvement of Bayesian PINN Training Convergence in Solving Multi-scale PDEs with Noise
Yilong Hou, Xi'an Li, Jinran Wu
Bayesian Physics-Informed Neural Networks (BPINNs) have received considerable attention for inferring the system states and physical parameters of differential equations from noisy observations. In practice, however, the Hamiltonian Monte Carlo (HMC) sampler used to estimate the internal parameters of a BPINN often runs into trouble, including poor performance and failed convergence for a given step size used to adjust the momentum of those parameters. To improve the convergence of HMC within the BPINN framework and extend its application scope to multi-scale partial differential equations (PDEs), we developed a robust multi-scale Bayesian PINN (dubbed MBPINN) method by integrating multi-scale deep neural networks (MscaleDNN) with Bayesian inference. In this newly proposed MBPINN method, we reframe HMC with Stochastic Gradient Descent (SGD) to ensure that the most ``likely'' estimate is always provided, and we configure its solver as a Fourier feature mapping-induced MscaleDNN. The MBPINN method offers several key advantages: (1) it is more robust than HMC, (2) it incurs less computational cost than HMC, and (3) it is more flexible for complex problems. We demonstrate the applicability and performance of the proposed method on general Poisson and multi-scale elliptic problems in one to three spatial dimensions. Our findings indicate that the proposed method can avoid HMC failures and provide valid results. Moreover, our method can handle complex PDEs and produces comparable results for general PDEs. These findings suggest that the proposed approach holds excellent potential for physics-informed machine learning, particularly for parameter estimation and solution recovery in ill-posed problems.
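To make the two ingredients named in the abstract concrete, the following is a minimal, illustrative sketch (assuming PyTorch): a Fourier feature mapping-based multi-scale surrogate network trained by plain SGD to a MAP ("most likely") estimate for a noisy 1D Poisson problem u''(x) = f(x). All class names, layer sizes, scale factors, and hyperparameters here are placeholder assumptions, not the paper's actual MBPINN configuration.

```python
import torch
import torch.nn as nn

class FourierFeatures(nn.Module):
    """Fixed random Fourier features at several frequency scales
    (an MscaleDNN-style input embedding; scales are illustrative)."""
    def __init__(self, in_dim, n_feat=32, scales=(1.0, 4.0, 16.0)):
        super().__init__()
        self.register_buffer(
            "B", torch.stack([s * torch.randn(in_dim, n_feat) for s in scales]))

    def forward(self, x):
        # x: (batch, in_dim) -> (batch, 2 * n_scales * n_feat)
        proj = torch.einsum("bi,sif->bsf", x, self.B)
        return torch.cat([torch.cos(proj), torch.sin(proj)], dim=-1).flatten(1)

class MscaleSurrogate(nn.Module):
    """Small fully connected network on top of the Fourier features."""
    def __init__(self, in_dim=1, hidden=64):
        super().__init__()
        self.embed = FourierFeatures(in_dim)
        n_embed = 2 * self.embed.B.shape[0] * self.embed.B.shape[2]
        self.net = nn.Sequential(
            nn.Linear(n_embed, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1))

    def forward(self, x):
        return self.net(self.embed(x))

def second_derivative(u, x):
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    return torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]

# Synthetic noisy observations of the exact solution u(x) = sin(2*pi*x)
x_obs = torch.rand(50, 1)
u_obs = torch.sin(2 * torch.pi * x_obs) + 0.05 * torch.randn_like(x_obs)
x_col = torch.rand(200, 1).requires_grad_(True)                  # collocation points
f_col = -(2 * torch.pi) ** 2 * torch.sin(2 * torch.pi * x_col)   # so that u'' = f

model = MscaleSurrogate()
opt = torch.optim.SGD(model.parameters(), lr=1e-4)
noise_var, prior_var = 0.05 ** 2, 1.0                            # illustrative values

for step in range(500):
    opt.zero_grad()
    # Negative log posterior = data misfit + PDE residual + Gaussian weight prior
    data_term = ((model(x_obs) - u_obs) ** 2).mean() / (2 * noise_var)
    residual = second_derivative(model(x_col), x_col) - f_col.detach()
    pde_term = (residual ** 2).mean() / (2 * noise_var)
    prior_term = sum((p ** 2).sum() for p in model.parameters()) / (2 * prior_var)
    loss = data_term + pde_term + prior_term
    loss.backward()
    opt.step()
```

Minimizing this negative log posterior with SGD yields a single point (MAP) estimate rather than HMC's posterior samples, which is the sense in which a gradient-based optimizer can "always provide" the most likely estimate when the sampler fails to converge.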
Submitted: Aug 18, 2024