Dependence Shift
Dependence shift is a type of distribution shift in which the conditional relationship between input features and labels, P(Y | X), changes across datasets or domains. Current research focuses on mitigating its impact on model fairness and generalization, in particular on how such shifts affect the performance and interpretability of models such as graph neural networks and convolutional neural networks. Addressing dependence shift is crucial for building robust and equitable AI systems, improving the reliability of model interpretations, and ensuring fair predictions across diverse populations and contexts.
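The effect described above can be illustrated with a small synthetic sketch (an assumption for illustration, not taken from any of the listed papers): two domains share the same input distribution P(X), but the feature-label relationship P(Y | X) is reversed in the target domain, so a classifier fit on the source domain degrades sharply.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_domain(n, w):
    """Sample features X ~ N(0, I) and labels Y with P(Y=1|X) = sigmoid(X @ w).

    The weight vector w controls the feature-label dependence, so changing w
    between domains simulates a dependence shift while P(X) stays fixed.
    """
    X = rng.normal(size=(n, 2))
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    y = (rng.random(n) < p).astype(int)
    return X, y

# Source domain: labels depend positively on both features.
X_src, y_src = make_domain(5000, np.array([2.0, 2.0]))
# Target domain: same P(X), but the dependence is flipped in sign.
X_tgt, y_tgt = make_domain(5000, np.array([-2.0, -2.0]))

model = LogisticRegression().fit(X_src, y_src)
acc_src = model.score(X_src, y_src)
acc_tgt = model.score(X_tgt, y_tgt)
print(f"source accuracy: {acc_src:.2f}, target accuracy: {acc_tgt:.2f}")
```

Because only P(Y | X) changed, no amount of covariate-shift correction (reweighting inputs) would help here; the model's learned decision boundary is simply wrong in the target domain, which is what makes dependence shift a distinct and harder problem.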