Normative Reasoning
Normative reasoning studies how agents, both human and artificial, make decisions guided by norms, rules, and ethical considerations. Current research centers on computational models and frameworks for representing and reasoning with normative information, including approaches based on deontic logic, answer set programming, and large language models, often incorporating constraints and preferences. This work aims to improve decision-making in contexts ranging from project selection to the design of ethical and socially responsible AI systems, ultimately contributing to more robust and beneficial applications of artificial intelligence.
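To make the idea of representing and reasoning with norms, constraints, and preferences concrete, here is a minimal sketch in Python. It is an illustration only, not drawn from any of the surveyed papers: the names Norm, Modality, and evaluate are hypothetical. It encodes obligations and prohibitions in a deontic-logic style as data, resolves conflicting norms by a simple priority preference, and checks an agent's actions for violations.

```python
from dataclasses import dataclass
from enum import Enum


class Modality(Enum):
    OBLIGATORY = "obligatory"
    FORBIDDEN = "forbidden"
    PERMITTED = "permitted"


@dataclass(frozen=True)
class Norm:
    modality: Modality
    action: str
    priority: int = 0  # higher priority wins when norms conflict


def evaluate(actions: set[str], norms: list[Norm]) -> list[str]:
    """Report violations of a norm set against the actions an agent took.

    A deontic-style check: an OBLIGATORY action missing from `actions`
    is a violation; a FORBIDDEN action present in `actions` is a
    violation. Conflicting norms over the same action are resolved by
    priority, a simple stand-in for the constraints and preferences
    mentioned above.
    """
    # Keep only the highest-priority norm per action (preference resolution).
    strongest: dict[str, Norm] = {}
    for norm in norms:
        current = strongest.get(norm.action)
        if current is None or norm.priority > current.priority:
            strongest[norm.action] = norm

    violations = []
    for norm in strongest.values():
        if norm.modality is Modality.OBLIGATORY and norm.action not in actions:
            violations.append(f"obligation violated: {norm.action}")
        elif norm.modality is Modality.FORBIDDEN and norm.action in actions:
            violations.append(f"prohibition violated: {norm.action}")
    return violations


if __name__ == "__main__":
    norms = [
        Norm(Modality.FORBIDDEN, "share_user_data", priority=2),
        Norm(Modality.PERMITTED, "share_user_data", priority=1),  # overridden
        Norm(Modality.OBLIGATORY, "log_decision", priority=1),
    ]
    print(evaluate({"share_user_data"}, norms))
    # ['prohibition violated: share_user_data', 'obligation violated: log_decision']
```

The same structure could be expressed declaratively in an answer set programming system such as clingo, where norms become rules and violations become derived atoms; the Python version is used here only to keep the example self-contained.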