Markov Chain
Markov chains are mathematical models of sequences of events in which the probability of each event depends only on the current state, not on the full history. They are fundamental tools with applications across diverse fields. Current research focuses on improving the efficiency of sampling from complex distributions with algorithms such as Gibbs sampling and Metropolis-Hastings, and on exploring connections between Markov chains and other models, including large language models and graph neural networks, to deepen their theoretical understanding and improve their practical performance. This work is significant because it drives advances in Bayesian inference, machine learning, and the analysis of complex systems, ultimately improving the accuracy and efficiency of many computational methods.
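As a concrete illustration of the sampling algorithms mentioned above, here is a minimal sketch of random-walk Metropolis-Hastings. The target density (a standard normal), the step size, and all names are illustrative assumptions, not taken from any of the listed papers; the point is that the sequence of accepted states forms a Markov chain whose stationary distribution is the target.

```python
import math
import random

def metropolis_hastings(log_density, n_samples, step_size=1.0, x0=0.0, seed=0):
    """Random-walk Metropolis-Hastings sketch (illustrative, 1-D).

    The returned sample path is itself a Markov chain: each proposal and
    accept/reject decision depends only on the current state x.
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        # Symmetric Gaussian proposal around the current state.
        proposal = x + rng.gauss(0.0, step_size)
        # Accept with probability min(1, pi(proposal) / pi(x));
        # the symmetric proposal density cancels in the ratio.
        if math.log(rng.random()) < log_density(proposal) - log_density(x):
            x = proposal
        samples.append(x)
    return samples

# Illustrative target: standard normal; an unnormalised
# log-density is sufficient for the acceptance ratio.
samples = metropolis_hastings(lambda x: -0.5 * x * x, 50_000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

With enough iterations, the empirical mean and variance of the chain approach those of the target distribution (0 and 1 here), even though only density ratios were ever evaluated.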
Papers
Extending Conformal Prediction to Hidden Markov Models with Exact Validity via de Finetti's Theorem for Markov Chains
Buddhika Nettasinghe, Samrat Chatterjee, Ramakrishna Tipireddy, Mahantesh Halappanavar
Functional Central Limit Theorem and Strong Law of Large Numbers for Stochastic Gradient Langevin Dynamics
Attila Lovas, Miklós Rásonyi