Paper ID: 2402.08151

Gradient-flow adaptive importance sampling for Bayesian leave one out cross-validation with application to sigmoidal classification models

Joshua C Chang, Xiangting Li, Shixin Xu, Hao-Ren Yao, Julia Porcino, Carson Chow

We introduce gradient-flow-guided adaptive importance sampling (IS) transformations for stabilizing Monte Carlo approximations of leave-one-out (LOO) cross-validated predictions for Bayesian models. After defining two variational problems, we derive corresponding simple nonlinear transformations that use gradient information to shift a model's pre-trained full-data posterior closer to the target LOO posterior predictive distributions. In doing so, the transformations stabilize the importance weights. The resulting Monte Carlo integrals involve Jacobian determinants that depend on the model Hessian. We derive closed-form exact formulae for these Jacobian determinants in the cases of logistic regression and shallow ReLU-activated artificial neural networks, and provide a simple approximation that sidesteps the need to compute full Hessian matrices and their spectra. We test the methodology on an $n\ll p$ dataset that is known to produce unstable LOO IS weights.
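As a rough illustration of the mechanics the abstract describes, the sketch below (in JAX) applies a single explicit gradient step that moves a full-data posterior draw against the held-out point's influence, then computes the corrected log importance weight with a Jacobian-determinant term. All names, the step size `eps`, and the trace-based log-determinant shortcut are illustrative assumptions, not the paper's exact variational transformations or closed-form formulae.

```python
import jax
import jax.numpy as jnp

def loo_log_weight(theta, log_post, log_lik_i, eps=1e-2):
    """Log importance weight for one transformed posterior draw (sketch).

    theta     : posterior draw, shape (p,)
    log_post  : unnormalized log full-data posterior, theta -> scalar
    log_lik_i : log-likelihood of the held-out point i, theta -> scalar
    """
    # Transformation T(theta) = theta - eps * grad log p(y_i | theta):
    # stepping against point i's influence shifts the draw toward the
    # LOO posterior, which is proportional to posterior / p(y_i | theta).
    g = jax.grad(log_lik_i)(theta)
    theta_new = theta - eps * g

    # Jacobian of T is I - eps * H, with H the Hessian of log_lik_i.
    # For small eps, log|det(I - eps * H)| ~ -eps * tr(H); the explicit
    # jax.hessian call here is only for illustration -- the paper derives
    # closed forms and an approximation that avoids the full Hessian.
    log_det_jac = -eps * jnp.trace(jax.hessian(log_lik_i)(theta))

    # log of target(theta_new) / proposal(theta), plus the Jacobian term:
    # target is the LOO posterior, proposal the full-data posterior.
    return (log_post(theta_new) - log_lik_i(theta_new)
            - log_post(theta) + log_det_jac)
```

The LOO predictive density for point i would then be estimated as the self-normalized weighted average of p(y_i | theta_new) over the transformed draws.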

Submitted: Feb 13, 2024