Markov Chain

Markov chains are mathematical models of sequences of events in which the probability of each event depends only on the state reached in the previous event; they are fundamental tools with applications across diverse fields. Current research focuses on improving the efficiency of sampling from complex distributions with algorithms such as Gibbs sampling and Metropolis-Hastings, and on exploring connections between Markov chains and other models, such as large language models and graph neural networks, to strengthen both theoretical understanding and practical performance. This work matters because it drives advances in Bayesian inference, machine learning, and the analysis of complex systems, improving the accuracy and efficiency of a wide range of computational methods.
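
The Metropolis-Hastings algorithm mentioned above works by constructing a Markov chain whose stationary distribution is the target distribution, so the chain's long-run samples approximate draws from that target. The sketch below is a minimal illustration, not a reference implementation: it assumes NumPy, uses a one-dimensional standard-normal target chosen purely for demonstration, and the function and parameter names are hypothetical.

```python
import numpy as np

def metropolis_hastings(log_target, n_samples=5000, x0=0.0, proposal_scale=1.0, seed=0):
    """Random-walk Metropolis-Hastings sampler for a 1-D target.

    log_target: log of the (possibly unnormalized) target density.
    """
    rng = np.random.default_rng(seed)
    samples = np.empty(n_samples)
    x = x0
    log_p_x = log_target(x)
    for i in range(n_samples):
        # Propose a new state from a symmetric Gaussian random walk.
        x_new = x + proposal_scale * rng.normal()
        log_p_new = log_target(x_new)
        # Accept with probability min(1, p(x_new) / p(x)); the symmetric
        # proposal density cancels in the acceptance ratio.
        if np.log(rng.uniform()) < log_p_new - log_p_x:
            x, log_p_x = x_new, log_p_new
        samples[i] = x  # on rejection, the chain stays at the current state
    return samples

if __name__ == "__main__":
    # Illustrative target: standard normal, log-density -x^2 / 2 up to a constant.
    draws = metropolis_hastings(lambda x: -0.5 * x**2)
    print(f"mean ~ {draws.mean():.3f}, std ~ {draws.std():.3f}")  # roughly 0 and 1
```

In practice, the proposal scale and the number of discarded burn-in samples are tuned to the target distribution; the same accept/reject structure underlies the more sophisticated samplers studied in the papers below.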

Papers