Tractable Probabilistic Model

Tractable probabilistic models aim to represent complex probability distributions in computationally efficient forms that support exact inference, such as computing marginal probabilities or drawing samples. Current research centers on probabilistic circuits (PCs), including variants such as sum-of-squares PCs and GraphSPNs, and focuses on improving their expressiveness and scalability through optimized GPU implementations, soft learning, and knowledge distillation from deep generative models. These advances are significant because they enable exact inference in applications ranging from image generation and language modeling to reinforcement learning and causal inference, sidestepping the approximations that intractable models require.
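To make the tractability claim concrete, here is a minimal sketch of a probabilistic circuit (a tiny sum-product network) over two binary variables. The structure and all parameters are illustrative, not taken from any particular paper; the point is that a smooth, decomposable circuit computes exact marginals in a single bottom-up pass by evaluating marginalized leaves as 1, with no summation over variable states.

```python
def leaf(p, value):
    """Bernoulli leaf: returns P(X = value).
    value=None marginalizes the variable out: a normalized leaf
    integrates to 1, which is what makes marginals tractable."""
    if value is None:
        return 1.0
    return p if value == 1 else 1.0 - p

def circuit(x1, x2):
    """A sum node mixing two product nodes over independent leaves.
    Smoothness (children of the sum share scope) and decomposability
    (children of each product have disjoint scopes) guarantee that
    marginalization reduces to evaluating leaves at 1."""
    comp1 = leaf(0.9, x1) * leaf(0.2, x2)   # product node 1
    comp2 = leaf(0.3, x1) * leaf(0.8, x2)   # product node 2
    return 0.4 * comp1 + 0.6 * comp2        # sum node (mixture weights)

# Exact joint probability P(X1=1, X2=0)
p_joint = circuit(1, 0)

# Exact marginal P(X1=1): one evaluation, no enumeration of X2's states
p_marginal = circuit(1, None)

# Sanity check: the marginal equals the sum of joints over X2
assert abs(p_marginal - (circuit(1, 0) + circuit(1, 1))) < 1e-9
print(p_joint, p_marginal)
```

An intractable model (e.g. a generic deep generative model) would need to approximate `p_marginal`; here the circuit's structural constraints make the same quantity available exactly, at the cost of one pass over the circuit.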

Papers