Argumentative Explanation

Argumentative explanation focuses on developing computational methods to understand, generate, and evaluate arguments, with the aim of improving the explainability and trustworthiness of AI systems and the quality of human-computer interaction. Current research emphasizes argumentation frameworks, often integrated with probabilistic models or large language models, to represent and reason with arguments, employing techniques such as dynamic programming, graph-based methods, and inductive logic programming. This work has significant implications for enhancing the transparency of AI decision-making, improving human-computer dialogue, and facilitating more effective communication and debate across a range of domains.
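
As a concrete illustration of the graph-based reasoning such frameworks support, the sketch below computes the grounded extension of a Dung-style abstract argumentation framework: the least fixed point of the characteristic function that accepts an argument once every one of its attackers is counter-attacked by an already-accepted argument. This is a minimal sketch assuming a plain attack-graph representation, not code from any specific paper in this collection; the function and argument names are illustrative.

```python
from typing import Dict, Set, Tuple

def grounded_extension(arguments: Set[str],
                       attacks: Set[Tuple[str, str]]) -> Set[str]:
    """Illustrative sketch: the grounded extension of a Dung-style
    abstract argumentation framework, computed as the least fixed
    point of F(S) = {a : every attacker of a is attacked by S}."""
    # Index each argument's attackers from the attack relation.
    attackers: Dict[str, Set[str]] = {a: set() for a in arguments}
    for source, target in attacks:
        attackers[target].add(source)

    def defended(arg: str, s: Set[str]) -> bool:
        # arg is acceptable w.r.t. s when each attacker of arg is
        # counter-attacked by at least one argument already in s.
        # (Unattacked arguments are vacuously acceptable.)
        return all(attackers[b] & s for b in attackers[arg])

    # Iterate F from the empty set; monotonicity of F over the
    # finite argument set guarantees termination at the least
    # fixed point.
    extension: Set[str] = set()
    while True:
        updated = {a for a in arguments if defended(a, extension)}
        if updated == extension:
            return extension
        extension = updated

# Toy framework (hypothetical): a attacks b, b attacks c.
# 'a' is unattacked, and 'a' defends 'c' against 'b'.
if __name__ == "__main__":
    args = {"a", "b", "c"}
    atts = {("a", "b"), ("b", "c")}
    print(grounded_extension(args, atts))  # -> {'a', 'c'}
```

The fixed-point iteration here stands in for the dynamic-programming and graph-based procedures mentioned above: acceptance is propagated through the attack graph in rounds until no further arguments can be defended.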

Papers