Uncertainty Mitigation

Uncertainty mitigation focuses on improving the reliability and trustworthiness of predictions and decisions made by complex systems, particularly in high-stakes applications. Current research emphasizes developing methods to quantify and manage uncertainty in various models, including large language models, generative diffusion models, and knowledge graphs, often employing techniques like ensemble methods, Bayesian approaches, and uncertainty-aware deep learning. This work is crucial for enhancing the safety and dependability of AI systems across diverse fields, from industrial automation and autonomous driving to knowledge representation and engineering design.
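One of the techniques mentioned above, ensemble methods, can be illustrated with a minimal sketch: several models are fit on bootstrap resamples of the same data, and their disagreement at a test point is read as a predictive-uncertainty estimate. This is a generic toy example (linear regression with NumPy), not the method of any particular paper; all names and data here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = 2x + noise (illustrative only).
x = rng.uniform(-1, 1, size=200)
y = 2.0 * x + rng.normal(scale=0.3, size=x.shape)

# Ensemble: fit several models on bootstrap resamples of the data.
# Disagreement across members serves as an uncertainty estimate.
n_members = 10
coeffs = []
for _ in range(n_members):
    idx = rng.integers(0, len(x), size=len(x))  # bootstrap sample
    coeffs.append(np.polyfit(x[idx], y[idx], deg=1))

x_test = np.linspace(-1, 1, 5)
preds = np.stack([np.polyval(c, x_test) for c in coeffs])  # (members, points)

mean = preds.mean(axis=0)          # ensemble prediction
uncertainty = preds.std(axis=0)    # spread across members

for xt, m, u in zip(x_test, mean, uncertainty):
    print(f"x={xt:+.2f}  pred={m:+.2f} +/- {u:.2f}")
```

In practice the same idea scales to deep ensembles (multiple independently trained networks), where larger disagreement flags inputs whose predictions should be treated with caution.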

Papers