Shapley Interaction
Shapley Interaction Indices (SIIs) extend the Shapley value, a concept from cooperative game theory, beyond individual feature importance: they quantify how much the *interaction* between features contributes to a machine learning model's predictions. Current research focuses on efficient algorithms for computing SIIs, particularly for complex models such as graph neural networks and tree ensembles, often using weighted least squares approximations or exploiting tree structure for speed. This work matters for interpreting "black box" models, deepening model understanding, and enabling more informed decision-making across diverse applications, from medical diagnosis to financial modeling.
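To make the idea concrete, here is a minimal sketch of the classical pairwise Shapley interaction index (Grabisch & Roubens) computed exactly by enumerating coalitions. It averages the discrete second-order derivative of a value function `v` over all coalitions excluding the pair; `v`, `n`, and the toy synergy game below are illustrative choices, not taken from the papers above, and real SII methods approximate this sum rather than enumerate it.

```python
from itertools import combinations
from math import factorial

def shapley_interaction(v, n, i, j):
    """Exact pairwise Shapley interaction index for players i, j
    in an n-player cooperative game with value function v(set) -> float."""
    others = [k for k in range(n) if k not in (i, j)]
    total = 0.0
    for size in range(len(others) + 1):
        for T in combinations(others, size):
            T = set(T)
            # Discrete second-order derivative of v at coalition T:
            # the joint effect of adding {i, j} minus the two solo effects.
            delta = v(T | {i, j}) - v(T | {i}) - v(T | {j}) + v(T)
            # Shapley weight for coalitions of this size.
            weight = factorial(size) * factorial(n - size - 2) / factorial(n - 1)
            total += weight * delta
    return total

# Toy game: payoff 1 only when players 0 and 1 cooperate (pure synergy).
synergy = lambda S: 1.0 if {0, 1} <= S else 0.0
print(shapley_interaction(synergy, n=3, i=0, j=1))  # → 1.0

# Additive game: no interaction, so the index is 0.
additive = lambda S: float(len(S))
print(shapley_interaction(additive, n=3, i=0, j=1))  # → 0.0
```

The exact computation enumerates all 2^(n-2) coalitions, which is why the efficient approximation schemes surveyed above exist.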
Papers
SHAP-IQ: Unified Approximation of any-order Shapley Interactions
Fabian Fumagalli, Maximilian Muschalik, Patrick Kolpaczki, Eyke Hüllermeier, Barbara Hammer
Understanding and Unifying Fourteen Attribution Methods with Taylor Interactions
Huiqi Deng, Na Zou, Mengnan Du, Weifu Chen, Guocan Feng, Ziwei Yang, Zheyang Li, Quanshi Zhang