Uncertainty Mitigation
Uncertainty mitigation focuses on improving the reliability and trustworthiness of predictions and decisions made by complex systems, particularly in high-stakes applications. Current research emphasizes methods to quantify and manage uncertainty across a range of models, including large language models, generative diffusion models, and knowledge graphs, often using ensemble methods, Bayesian approaches, and uncertainty-aware deep learning. This work is crucial for enhancing the safety and dependability of AI systems in diverse fields, from industrial automation and autonomous driving to knowledge representation and engineering design.
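As a concrete illustration of one technique mentioned above, a minimal sketch of ensemble-based uncertainty quantification follows. This is an illustrative assumption, not a method from any specific paper listed below: disagreement among ensemble members (here measured as the standard deviation of their predictions) is commonly read as predictive uncertainty, and inputs with high disagreement can be flagged for human review or abstention.

```python
import statistics

def ensemble_predict(models, x):
    """Return the mean prediction and the standard deviation across an ensemble.

    A large standard deviation signals disagreement among members, which is
    commonly interpreted as high predictive uncertainty for input x.
    """
    preds = [m(x) for m in models]
    mean = statistics.fmean(preds)
    std = statistics.stdev(preds) if len(preds) > 1 else 0.0
    return mean, std

# Toy ensemble of three regressors (hypothetical models for illustration):
# they agree near x = 0 and diverge as |x| grows.
models = [lambda x: 2.0 * x, lambda x: 2.1 * x, lambda x: 1.9 * x]

mean_near, std_near = ensemble_predict(models, 1.0)
mean_far, std_far = ensemble_predict(models, 10.0)
# std_far exceeds std_near: the ensemble is less certain far from x = 0.
```

In practice the ensemble members would be independently trained networks (or posterior samples, in the Bayesian variants the summary mentions) rather than toy lambdas, but the aggregation step is the same.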
Papers
- October 19, 2024
- July 15, 2024
- June 5, 2024
- May 27, 2024
- May 22, 2024
- January 3, 2024
- April 10, 2023
- March 7, 2023
- April 21, 2022