Semantic Uncertainty
Semantic uncertainty, i.e., ambiguity or low confidence in the meaning of model outputs, is a critical challenge across fields such as natural language processing and computer vision. Current research focuses on quantifying and mitigating this uncertainty using techniques such as Bayesian inference, entropy measures (including quantum entropy), and graph-based analyses of semantic relationships among model predictions and responses. Addressing semantic uncertainty is crucial for improving the reliability and trustworthiness of AI systems, enabling safer and more robust applications in areas such as autonomous navigation, human-AI interaction, and knowledge-grounded dialogue.
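One common entropy-based recipe in this line of work is semantic entropy: sample several answers from a model, group them into meaning-equivalent clusters, and compute the entropy of the cluster distribution rather than of the raw strings. The sketch below illustrates the idea with a pluggable `equivalent` predicate; in practice this check is typically an NLI-based bidirectional-entailment model, and the toy lowercase comparison here is purely an assumption for the example.

```python
import math

def semantic_entropy(answers, equivalent):
    """Entropy over meaning clusters of sampled model answers.

    `answers` is a list of sampled responses; `equivalent(a, b)` decides
    whether two answers share a meaning (a stand-in for the entailment
    check used in the literature).
    """
    clusters = []  # each cluster holds answers judged to mean the same thing
    for a in answers:
        for c in clusters:
            if equivalent(a, c[0]):
                c.append(a)
                break
        else:
            clusters.append([a])
    n = len(answers)
    probs = [len(c) / n for c in clusters]  # empirical cluster probabilities
    return -sum(p * math.log(p) for p in probs)

# Toy equivalence: answers match if their normalized text is identical.
samples = ["Paris", "paris", "Lyon", "Paris"]
h = semantic_entropy(samples, lambda a, b: a.lower() == b.lower())
```

Low semantic entropy signals that the model's samples, however differently worded, converge on one meaning; high entropy flags genuine disagreement and is used as an uncertainty score.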