Contrastive Divergence
Contrastive divergence (CD) is a family of approximate maximum-likelihood training methods for energy-based models (EBMs), a class of probabilistic models that define probability distributions implicitly through energy functions. The log-likelihood gradient of an EBM involves an expectation under the model distribution, which CD estimates by running a few steps of Markov chain Monte Carlo (MCMC) initialized at the training data. Current research focuses on improving the efficiency and effectiveness of CD, including novel sampling techniques such as diffusion models that replace computationally expensive MCMC, and alternative loss functions that reduce the reliance on explicit score (log-density gradient) computations. These advances aim to improve the training and performance of EBMs in applications such as image generation and multi-modal learning by addressing limitations of existing CD implementations and improving the quality of generated samples.
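To make the algorithm concrete, the sketch below shows a CD-k update for a Bernoulli restricted Boltzmann machine in NumPy: the positive phase collects statistics on the data, and the negative phase runs k steps of block Gibbs sampling, the MCMC procedure whose cost motivates the research directions above. This is a minimal illustration, not any particular paper's implementation; the names `RBM` and `cd_step` are our own.

```python
# Minimal CD-k training step for a Bernoulli-Bernoulli RBM (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    def __init__(self, n_visible, n_hidden):
        self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.b = np.zeros(n_visible)  # visible biases
        self.c = np.zeros(n_hidden)   # hidden biases

    def sample_hidden(self, v):
        # Conditional p(h=1 | v) and a Bernoulli sample from it.
        p = sigmoid(v @ self.W + self.c)
        return p, (rng.random(p.shape) < p).astype(float)

    def sample_visible(self, h):
        # Conditional p(v=1 | h) and a Bernoulli sample from it.
        p = sigmoid(h @ self.W.T + self.b)
        return p, (rng.random(p.shape) < p).astype(float)

    def cd_step(self, v0, k=1, lr=0.1):
        # Positive phase: statistics under the data distribution.
        ph0, h = self.sample_hidden(v0)
        # Negative phase: k steps of block Gibbs sampling started at the data.
        vk = v0
        for _ in range(k):
            _, vk = self.sample_visible(h)
            phk, h = self.sample_hidden(vk)
        # CD-k gradient estimate: data statistics minus model statistics.
        batch = v0.shape[0]
        self.W += lr * (v0.T @ ph0 - vk.T @ phk) / batch
        self.b += lr * (v0 - vk).mean(axis=0)
        self.c += lr * (ph0 - phk).mean(axis=0)

# Example: one CD-1 update on a batch of random binary data.
rbm = RBM(n_visible=16, n_hidden=8)
data = (rng.random((32, 16)) < 0.5).astype(float)
rbm.cd_step(data, k=1, lr=0.1)
```

Because the chain is initialized at the data and truncated after k steps, the update is biased but cheap; the sampling-based research directions above can be read as different ways of removing or reducing this bias.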