Paper ID: 2302.01079

Uncertainty in Fairness Assessment: Maintaining Stable Conclusions Despite Fluctuations

Ainhize Barrainkua, Paula Gordaliza, Jose A. Lozano, Novi Quadrianto

Several recent works encourage the use of a Bayesian framework when assessing the performance and fairness metrics of a classification algorithm in a supervised setting. We propose the Uncertainty Matters (UM) framework, which generalizes a Beta-Binomial approach to derive the posterior distribution of any combination of criteria, allowing stable performance assessment in a bias-aware setting. We model the confusion matrix of each demographic group with a Multinomial distribution whose parameters are updated through a Bayesian procedure, and we extend UM to the popular K-fold cross-validation procedure. Experiments highlight the benefits of UM over classical evaluation frameworks in terms of informativeness and stability.
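The core idea the abstract describes (a Multinomial likelihood over the confusion-matrix cells of each demographic group, updated with a conjugate Dirichlet prior) can be sketched as follows. This is an illustrative assumption of the general Dirichlet-Multinomial recipe, not the authors' exact UM implementation; the group names, counts, and the demographic-parity gap as the derived criterion are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical observed confusion-matrix counts (TP, FP, FN, TN) per group.
# With a Multinomial likelihood and a Dirichlet prior, the posterior over
# the four cell probabilities is again Dirichlet (conjugacy).
counts = {"group_a": np.array([40, 10, 15, 35]),
          "group_b": np.array([25, 20, 10, 45])}
prior = np.ones(4)  # uniform Dirichlet prior over the four cells

def posterior_samples(c, prior, n=10000, rng=rng):
    """Draw n samples of (TP, FP, FN, TN) cell probabilities from the posterior."""
    return rng.dirichlet(prior + c, size=n)

# Any derived criterion inherits a posterior by pushing samples through it,
# e.g. the positive prediction rate P(Yhat = 1) = TP + FP per group and the
# demographic-parity gap between the two groups.
s_a = posterior_samples(counts["group_a"], prior)
s_b = posterior_samples(counts["group_b"], prior)
gap = (s_a[:, 0] + s_a[:, 1]) - (s_b[:, 0] + s_b[:, 1])
lo, hi = np.percentile(gap, [2.5, 97.5])
print(f"DP gap posterior mean {gap.mean():.3f}, 95% interval [{lo:.3f}, {hi:.3f}]")
```

Because the posterior is over the full cell-probability vector, the same samples give credible intervals for any metric built from the confusion matrix, which is what makes the assessment stable rather than a single point estimate.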

Submitted: Feb 2, 2023