Normative Reasoning

Normative reasoning studies how agents, both human and artificial, make decisions guided by norms, rules, and ethical considerations. Current research emphasizes computational models and frameworks, including those based on deontic logic, answer set programming, and large language models, for representing and reasoning with normative information, often incorporating constraints and preferences. This work aims to improve decision-making in contexts ranging from project selection to the design of ethical and socially responsible AI systems, contributing to more robust and beneficial applications of artificial intelligence.
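As a rough illustration of what "reasoning with normative information, often incorporating constraints and preferences" can mean in practice, the Python sketch below encodes deontic statuses (obligatory, forbidden, permitted) over actions and resolves conflicts by a simple priority ordering. All names and the priority-based resolution rule are illustrative assumptions, not drawn from any specific paper in this collection.

```python
from dataclasses import dataclass

# Deontic statuses an action can have under a norm.
OBLIGATORY, FORBIDDEN, PERMITTED = "obligatory", "forbidden", "permitted"

@dataclass
class Norm:
    action: str
    status: str    # one of OBLIGATORY / FORBIDDEN / PERMITTED
    priority: int  # preference: higher priority overrides lower on conflict

def effective_status(action, norms, default=PERMITTED):
    """Resolve an action's deontic status under prioritized norms."""
    relevant = [n for n in norms if n.action == action]
    if not relevant:
        return default
    return max(relevant, key=lambda n: n.priority).status

def check_plan(plan, norms):
    """Return violations: forbidden actions taken, obligations omitted."""
    taken = set(plan)
    violations = []
    for a in taken:
        if effective_status(a, norms) == FORBIDDEN:
            violations.append(("performed forbidden action", a))
    obligatory = {n.action for n in norms
                  if effective_status(n.action, norms) == OBLIGATORY}
    for a in sorted(obligatory - taken):
        violations.append(("omitted obligatory action", a))
    return violations

# Hypothetical norms: a general ban on sharing data, an exception that
# overrides it, and an obligation to log every decision.
norms = [
    Norm("share_data", FORBIDDEN, priority=1),
    Norm("share_data", PERMITTED, priority=2),
    Norm("log_decision", OBLIGATORY, priority=1),
]
print(check_plan(["share_data"], norms))
# -> [('omitted obligatory action', 'log_decision')]
```

Defeasible overriding of this kind is a common pattern in deontic-logic and answer-set-programming formalizations, where exceptions to general norms are expressed as higher-priority rules rather than by rewriting the norm base.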

Papers