Amortized Inference
Amortized inference speeds up Bayesian inference by learning a function that maps data directly to posterior distributions, replacing the expensive per-dataset optimization or sampling that traditional methods require for each new observation. Current research applies this approach to challenging inverse problems, leveraging neural networks (such as VAEs and diffusion models), normalizing flows, and equivariant architectures to improve efficiency and accuracy across diverse applications. The methodology is having a significant impact in fields such as cryo-EM, medical imaging, and causal inference, where it enables scalable analysis of large datasets and complex models that are intractable for traditional methods. The resulting speedups and improved generalization are reshaping data analysis across scientific domains.
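The core idea above can be sketched in a toy setting. The example below is a minimal illustration (not the method of any paper listed here, and with a trivial linear "inference network" standing in for the neural networks mentioned above): for a conjugate Gaussian model, whose posterior is known in closed form, we simulate many datasets, fit a map from a data summary to the posterior mean, and then infer the posterior for a new dataset in a single forward pass with no per-dataset computation. All names and parameter choices here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10  # observations per dataset (illustrative choice)

def analytic_posterior_mean(x):
    # Prior mu ~ N(0, 1), likelihood x_i ~ N(mu, 1):
    # posterior mean is n * xbar / (n + 1).
    return n * x.mean() / (n + 1)

# --- "Training" phase: simulate datasets, fit the amortized map once ---
mus = rng.normal(size=5000)                           # latent means drawn from the prior
datasets = mus[:, None] + rng.normal(size=(5000, n))  # one simulated dataset per row
summaries = datasets.mean(axis=1)                     # summary statistic per dataset
targets = np.array([analytic_posterior_mean(x) for x in datasets])

# Least-squares fit of a linear inference "network": predicted mean = w * xbar + b.
A = np.stack([summaries, np.ones_like(summaries)], axis=1)
w, b = np.linalg.lstsq(A, targets, rcond=None)[0]

# --- Amortized inference on a NEW dataset: one forward pass, no re-fitting ---
x_new = rng.normal(loc=0.7, size=n)
amortized_mean = w * x_new.mean() + b
print(amortized_mean, analytic_posterior_mean(x_new))
```

Because the true posterior mean is exactly linear in the sample mean here, the fitted map recovers it; in the papers below, the same amortization principle is applied with deep networks to problems (such as cryo-EM pose estimation) where no closed form exists.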
Papers
CryoAI: Amortized Inference of Poses for Ab Initio Reconstruction of 3D Molecular Volumes from Real Cryo-EM Images
Axel Levy, Frédéric Poitevin, Julien Martel, Youssef Nashed, Ariana Peck, Nina Miolane, Daniel Ratner, Mike Dunne, Gordon Wetzstein
Variational inference of fractional Brownian motion with linear computational complexity
Hippolyte Verdier, François Laurent, Alhassan Cassé, Christian Vestergaard, Jean-Baptiste Masson