Belief Distribution
Belief distribution research focuses on representing and manipulating uncertainty in decision-making scenarios, aiming to improve the robustness and explainability of intelligent systems. Current work emphasizes algorithms and models such as particle filters, Bayesian belief networks, and neural networks incorporating Theory of Mind to represent and update belief distributions in complex environments, often leveraging techniques from reinforcement learning and multi-agent systems. This work has significant implications for robotics, human-computer interaction, and other fields where agents must make decisions under uncertainty, enabling more adaptable and reliable systems.
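The core operation shared by the methods above is updating a belief distribution when new evidence arrives. A minimal sketch of a discrete Bayesian belief update is shown below; the states, prior, and sensor likelihoods are illustrative assumptions, not taken from any specific paper:

```python
def update_belief(prior, likelihood):
    """Posterior ∝ prior × likelihood, renormalized to sum to 1."""
    unnormalized = [p * l for p, l in zip(prior, likelihood)]
    total = sum(unnormalized)
    if total == 0:
        raise ValueError("Observation has zero probability under every state.")
    return [u / total for u in unnormalized]

# Hypothetical belief over two robot locations: [room A, room B].
prior = [0.5, 0.5]
# Assumed sensor model: the reading is 3x as likely if the robot is in room A.
likelihood = [0.9, 0.3]
posterior = update_belief(prior, likelihood)
print(posterior)  # belief shifts toward room A: [0.75, 0.25]
```

Particle filters apply the same idea to continuous state spaces by weighting and resampling samples instead of enumerating states.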
Papers
Paper entries dated March 23, 2022 through October 11, 2024 (titles and links not preserved).