Paper ID: 2208.03209

Bias and Fairness in Computer Vision Applications of the Criminal Justice System

Sophie Noiret, Jennifer Lumetzberger, Martin Kampel

Discriminatory practices involving AI-driven police work have been the subject of much controversy in the past few years, with algorithms such as COMPAS, PredPol, and ShotSpotter being accused of unfairly impacting minority groups. At the same time, the issues of fairness in machine learning, and in particular in computer vision, have been the subject of a growing number of academic works. In this paper, we examine how these areas intersect. We provide information on how these practices have come to exist and the difficulties in alleviating them. We then examine three applications currently in development to understand what risks they pose to fairness and how those risks can be mitigated.

Submitted: Aug 5, 2022