Posterior Inference
Posterior inference aims to estimate the probability distribution of unknown parameters given observed data, a central task in Bayesian statistics and machine learning. Current research focuses heavily on developing efficient algorithms, including Markov chain Monte Carlo (MCMC) methods, variational inference (VI) techniques, and diffusion models, often combined with neural networks to handle complex, high-dimensional data. These advances are improving the accuracy and scalability of posterior inference across diverse fields, from robotics and power systems to medical imaging and scientific modeling, enabling more robust uncertainty quantification and better-informed decision-making.
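To make the task concrete, the sketch below runs a minimal Metropolis-Hastings sampler, the simplest of the MCMC methods mentioned above, to approximate the posterior of a Gaussian mean with known noise scale. The model (Normal prior, Normal likelihood), the data, and all parameter values are illustrative assumptions, not taken from any of the papers listed here.

```python
import math
import random

def log_posterior(mu, data, prior_mu=0.0, prior_sigma=10.0, noise_sigma=1.0):
    """Unnormalized log posterior for an assumed Normal prior and Normal likelihood."""
    log_prior = -0.5 * ((mu - prior_mu) / prior_sigma) ** 2
    log_lik = sum(-0.5 * ((x - mu) / noise_sigma) ** 2 for x in data)
    return log_prior + log_lik

def metropolis_hastings(data, n_samples=5000, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings over the scalar parameter mu."""
    rng = random.Random(seed)
    mu = 0.0
    current_lp = log_posterior(mu, data)
    samples = []
    for _ in range(n_samples):
        proposal = mu + rng.gauss(0.0, step)
        proposal_lp = log_posterior(proposal, data)
        # Accept with probability min(1, posterior(proposal) / posterior(current)).
        if math.log(rng.random()) < proposal_lp - current_lp:
            mu, current_lp = proposal, proposal_lp
        samples.append(mu)
    return samples

# Toy data set; discard the first half of the chain as burn-in.
data = [1.8, 2.1, 2.4, 1.9, 2.2]
samples = metropolis_hastings(data)
kept = samples[2500:]
posterior_mean = sum(kept) / len(kept)
```

With a wide prior, the posterior mean should land close to the sample mean of the data (about 2.08); the retained draws also give an immediate uncertainty estimate via their spread, which is the kind of uncertainty quantification the overview refers to.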
Papers
Bayesian Additive Main Effects and Multiplicative Interaction Models using Tensor Regression for Multi-environmental Trials
Antonia A. L. Dos Santos, Danilo A. Sarti, Rafael A. Moral, Andrew C. Parnell
Fast and Correct Gradient-Based Optimisation for Probabilistic Programming via Smoothing
Basim Khajwal, C.-H. Luke Ong, Dominik Wagner