Responsibility Allocation

Responsibility allocation, the process of determining who or what is accountable for an outcome, is a rapidly developing field with applications ranging from autonomous driving to human-robot interaction and AI ethics. Current research focuses on developing methods for assigning responsibility in complex multi-agent systems, often employing models like risk chains and control barrier functions to quantify individual contributions and optimize allocation strategies. This work is crucial for ensuring safety, fairness, and accountability in increasingly autonomous systems, impacting both the design of these systems and the legal and ethical frameworks governing their use.
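As a concrete illustration of the control-barrier-function idea mentioned above, the sketch below splits responsibility between two agents in proportion to how much each one's action pushes a shared safety barrier toward violation. Everything here is a minimal assumed setup, not any specific paper's method: single-integrator dynamics, a distance-based barrier `h(x)`, and a proportional attribution rule are all hypothetical choices.

```python
import numpy as np

# Hypothetical setup: state x = [p1x, p1y, p2x, p2y] stacks two planar agents,
# each with single-integrator dynamics (xdot = [u1, u2]).
# Safety is encoded as a control barrier function h(x) >= 0.

def barrier(x):
    # h(x): inter-agent distance minus an assumed safety radius of 1.0
    return np.linalg.norm(x[0:2] - x[2:4]) - 1.0

def barrier_grad(x):
    # Gradient of h with respect to the stacked state
    d = x[0:2] - x[2:4]
    n = d / np.linalg.norm(d)
    return np.concatenate([n, -n])

def responsibility_shares(x, u1, u2):
    """Attribute responsibility for barrier decay in proportion to how much
    each agent's control input decreases h (i.e., pushes toward violation)."""
    g = barrier_grad(x)
    c1 = -g[0:2] @ u1   # agent 1's contribution to -hdot
    c2 = -g[2:4] @ u2   # agent 2's contribution to -hdot
    harmful = np.maximum([c1, c2], 0.0)  # only barrier-decreasing actions count
    total = harmful.sum()
    if total == 0.0:
        # Neither agent is reducing the safety margin; split evenly by convention
        return np.array([0.5, 0.5])
    return harmful / total

# Agent 1 drives straight at a stationary agent 2: all responsibility falls on it
x = np.array([0.0, 0.0, 2.0, 0.0])
shares = responsibility_shares(x, np.array([1.0, 0.0]), np.array([0.0, 0.0]))
print(shares)  # → [1. 0.]
```

The proportional rule is one of many possible attribution schemes; research in this area also considers counterfactual and game-theoretic (e.g., Shapley-value) allocations over the same barrier-based contributions.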

Papers