Drift Explanation
Concept drift, the change of data distributions over time that degrades model performance, is a significant challenge in machine learning. Current research focuses on unsupervised drift detection methods, often leveraging deep learning representations and autoencoders to identify and characterize these changes, particularly for applications such as cybersecurity. A key area of investigation is "drift explanation": providing human-understandable descriptions of the underlying distributional shifts, which improves model interpretability and enables targeted mitigation strategies. This work has implications for the robustness and reliability of machine learning systems across diverse applications.
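To make the idea of unsupervised, representation-based drift detection concrete, here is a minimal sketch (not any specific paper's method): fit a low-dimensional PCA representation on reference data, then flag drift when new samples have unusually high reconstruction error relative to the reference distribution. All function names and the 95th-percentile threshold are illustrative assumptions.

```python
import numpy as np

def fit_pca(X, k):
    # Fit a rank-k PCA model: mean vector plus top-k principal directions.
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

def reconstruction_error(X, mu, components):
    # Per-sample mean squared error after projecting onto the PCA subspace.
    Z = (X - mu) @ components.T
    X_hat = Z @ components + mu
    return np.mean((X - X_hat) ** 2, axis=1)

def drift_score(ref_err, new_err):
    # Fraction of new samples whose error exceeds the reference
    # 95th percentile (illustrative threshold; ~0.05 expected under no drift).
    threshold = np.quantile(ref_err, 0.95)
    return float(np.mean(new_err > threshold))

rng = np.random.default_rng(0)
X_ref = rng.normal(size=(500, 10))           # reference distribution
mu, comps = fit_pca(X_ref, k=3)
ref_err = reconstruction_error(X_ref, mu, comps)

X_same = rng.normal(size=(200, 10))          # same distribution
X_drift = rng.normal(loc=2.0, size=(200, 10))  # mean-shifted distribution

score_same = drift_score(ref_err, reconstruction_error(X_same, mu, comps))
score_drift = drift_score(ref_err, reconstruction_error(X_drift, mu, comps))
print(score_same, score_drift)
```

The same recipe generalizes to deep autoencoders: replace PCA with an encoder/decoder and reuse the reconstruction-error comparison unchanged. Drift *explanation* would go one step further, e.g. attributing the error increase to the individual features that changed most.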
Papers
June 24, 2024
March 18, 2024
March 16, 2023
February 6, 2023
January 19, 2023
June 23, 2022