Artificial Intelligence Audit
Artificial intelligence (AI) auditing systematically evaluates AI systems, their development processes, and their impacts against predefined criteria, with the aim of ensuring safety, fairness, and compliance. Current research emphasizes standardized auditing procedures that combine technical assessments of model properties with process-oriented evaluations of governance structures, and focuses on determining the level of access to AI systems (from black-box query access to white-box access to model internals, and beyond) that effective audits require. The field is crucial for building trust in AI, informing policy decisions, and mitigating the societal risks posed by increasingly prevalent AI systems across sectors.
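The distinction between access levels can be made concrete with a minimal sketch: a black-box audit can only query a system's input/output interface, while a white-box audit can inspect internal parameters directly. The toy model, function names, and fairness metric (a demographic-parity gap) below are illustrative assumptions, not a standard auditing API.

```python
class ToyModel:
    """Stand-in for an AI system under audit: a fixed linear classifier."""

    def __init__(self, weights):
        self.weights = weights  # internal parameters (white-box access only)

    def predict(self, x):
        # Black-box interface: inputs in, binary decisions out.
        score = sum(w * v for w, v in zip(self.weights, x))
        return 1 if score > 0 else 0


def black_box_fairness_audit(predict, inputs, groups):
    """Demographic-parity gap computed with query access to `predict` only."""
    rates = {}
    for x, g in zip(inputs, groups):
        pos, total = rates.get(g, (0, 0))
        rates[g] = (pos + predict(x), total + 1)
    selection = [pos / total for pos, total in rates.values()]
    return max(selection) - min(selection)


def white_box_weight_audit(model, feature_index):
    """Check whether a given (possibly protected) input feature can
    influence the model at all -- requires access to internals."""
    return model.weights[feature_index] != 0


model = ToyModel(weights=[1.0, -2.0, 0.0])
inputs = [[1, 0, 0], [0, 1, 0], [1, 1, 0], [0, 0, 1]]
groups = ["a", "a", "b", "b"]

# Black-box: positive-decision rate differs by 0.5 between groups.
gap = black_box_fairness_audit(model.predict, inputs, groups)

# White-box: the third feature has zero weight, so it cannot affect outputs.
uses_feature_2 = white_box_weight_audit(model, 2)
```

The black-box audit treats the system as an oracle, which matches the access an external auditor typically has; the white-box check answers a question (does the model use this feature at all?) that no finite set of queries can settle conclusively.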