Permutation Importance
Permutation importance is a model-agnostic technique for assessing the relative importance of input features in machine learning models: it measures the drop in predictive accuracy after randomly shuffling a feature's values, which breaks that feature's association with the target. Current research addresses limitations of the standard approach, such as bias in the presence of correlated features, by developing methods like conditional permutation importance and by incorporating contextual information to improve reliability and reduce variance. These advances enhance the interpretability of complex models, particularly in high-dimensional settings, supporting better model understanding, feature selection, and ultimately more trustworthy and robust predictions across diverse applications such as autonomous driving and medical diagnosis.
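The basic procedure (score the model, shuffle one feature, re-score, and take the difference) can be sketched as below. This is a minimal illustration, not code from the papers listed here; the dataset, model, and number of shuffle repeats are arbitrary choices, and in practice the equivalent `sklearn.inspection.permutation_importance` utility would typically be used instead.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic data and a fitted model (illustrative choices).
X, y = make_classification(n_samples=500, n_features=5, n_informative=3,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

baseline = model.score(X_test, y_test)  # accuracy before shuffling
rng = np.random.default_rng(0)

importances = []
for j in range(X_test.shape[1]):
    scores = []
    for _ in range(10):  # repeat shuffles to reduce variance of the estimate
        X_perm = X_test.copy()
        X_perm[:, j] = rng.permutation(X_perm[:, j])  # break feature j's link to y
        scores.append(model.score(X_perm, y_test))
    # Importance = mean accuracy drop caused by shuffling feature j.
    importances.append(baseline - np.mean(scores))

print([round(v, 3) for v in importances])
```

Because each feature is shuffled marginally, correlated features can "cover" for one another and receive deflated scores; this is the bias that conditional permutation importance, which permutes a feature within strata of its correlated covariates, is designed to reduce.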
Papers
Augmented Functional Random Forests: Classifier Construction and Unbiased Functional Principal Components Importance through Ad-Hoc Conditional Permutations
Fabrizio Maturo, Annamaria Porreca
Measuring Variable Importance in Individual Treatment Effect Estimation with High Dimensional Data
Joseph Paillard, Vitaliy Kolodyazhniy, Bertrand Thirion, Denis A. Engemann