Responsibility Allocation
Responsibility allocation, the process of determining who or what is accountable for an outcome, is a rapidly developing field with applications ranging from autonomous driving to human-robot interaction and AI ethics. Current research focuses on methods for assigning responsibility in complex multi-agent systems, often employing formal tools such as risk chains and control barrier functions to quantify each agent's contribution and to optimize allocation strategies. This work is crucial for ensuring safety, fairness, and accountability in increasingly autonomous systems, and it informs both the design of these systems and the legal and ethical frameworks governing their use.
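To make the control-barrier-function idea concrete, the sketch below shows one simple way a CBF can be used to apportion responsibility between two agents. It is purely illustrative and not taken from any of the papers on this topic: it assumes two single-integrator agents on a line, a hypothetical squared-distance barrier h, and splits the safety constraint ḣ + αh ≥ 0 in proportion to each agent's contribution to ḣ. All names and parameters (x1, x2, u1, u2, alpha, d_safe) are assumptions chosen for the example.

```python
# Illustrative sketch: CBF-based responsibility split between two agents
# moving on a line with single-integrator dynamics x_i' = u_i.
# All symbols here are assumptions for the example, not any paper's method.

def barrier(x1: float, x2: float, d_safe: float = 1.0) -> float:
    """h(x) >= 0 encodes safety: squared gap minus squared safe distance."""
    return (x2 - x1) ** 2 - d_safe ** 2

def responsibility_shares(x1: float, x2: float, u1: float, u2: float,
                          alpha: float = 1.0):
    """Split the CBF condition h' + alpha*h >= 0 between the two agents.

    Since h' = (dh/dx1)*u1 + (dh/dx2)*u2, each agent's influence on the
    safety margin is its own term in that sum; responsibility is assigned
    in proportion to the magnitude of that influence.
    """
    gap = x2 - x1
    grad1, grad2 = -2.0 * gap, 2.0 * gap      # dh/dx1, dh/dx2
    c1, c2 = grad1 * u1, grad2 * u2           # per-agent contributions to h'
    margin = c1 + c2 + alpha * barrier(x1, x2)
    total = abs(c1) + abs(c2)
    if total == 0.0:
        return 0.5, 0.5, margin               # no influence: symmetric default
    return abs(c1) / total, abs(c2) / total, margin

# Agent 1 closes the gap faster than agent 2, so it bears the larger share.
r1, r2, margin = responsibility_shares(x1=0.0, x2=3.0, u1=1.0, u2=-0.5)
print(f"agent 1: {r1:.2f}, agent 2: {r2:.2f}, safety margin: {margin:.2f}")
```

A negative safety margin in this toy model signals that the joint behavior violates the barrier condition; the proportional shares then indicate how much of that violation is attributable to each agent, which is the basic intuition behind CBF-based allocation schemes.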
Papers
Eight papers on this topic, published between February 24, 2023 and October 9, 2024.