Exposure Fairness
Exposure fairness research aims to mitigate biases in algorithmic systems that disproportionately affect certain groups, ensuring equitable representation and access to information and opportunities. Current work focuses on metrics that quantify exposure bias in ranked lists and recommender systems, on algorithms such as UCB-based methods for stochastic bandits and fairness-aware Vision Transformers with debiasing techniques, and on extending fairness considerations beyond individual items and users to group-level interactions. This research is crucial for addressing the societal harms of algorithmic bias, promoting more equitable and just outcomes in information retrieval, computer vision, and online platforms.
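To make the idea of quantifying exposure bias in a ranked list concrete, here is a minimal sketch of one common style of metric: aggregate position-weighted exposure per group under an assumed logarithmic position-bias model (as in DCG-style discounting). The function name, the toy data, and the choice of discount are illustrative assumptions, not a specific metric from the literature.

```python
import math

def exposure_by_group(ranking, groups):
    """Share of total exposure each group receives in a ranked list.

    Assumes a logarithmic position-bias model: the item at rank k
    is examined with probability proportional to 1 / log2(k + 1).
    (This discount is an illustrative assumption, not the only choice.)
    """
    totals = {}
    for rank, item in enumerate(ranking, start=1):
        weight = 1.0 / math.log2(rank + 1)  # examination-probability proxy
        g = groups[item]
        totals[g] = totals.get(g, 0.0) + weight
    # Normalize so group exposure shares sum to 1 for easy comparison.
    total = sum(totals.values())
    return {g: v / total for g, v in totals.items()}

# Toy example: items a, b belong to group "A"; c, d to group "B".
ranking = ["a", "b", "c", "d"]
groups = {"a": "A", "b": "A", "c": "B", "d": "B"}
shares = exposure_by_group(ranking, groups)
# Group "A" occupies the top ranks, so its exposure share exceeds "B"'s
# even though both groups contribute two items each.
```

Comparing these exposure shares against a target (for example, each group's share of total relevance) is one way such metrics flag disproportionate exposure; fairness-aware ranking algorithms then re-rank to close the gap.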