Model Sensitivity
Model sensitivity, the degree to which a model's output changes in response to variations in its inputs or parameters, is a crucial area of research across diverse machine learning applications. Current research focuses on identifying and mitigating sensitivity to spurious correlations, biases (e.g., gender bias in language models), and data artifacts, using techniques such as spectral analysis, gradient penalization, and surrogate models. Understanding and controlling model sensitivity is vital for improving robustness, generalization, and reliability, ultimately leading to more trustworthy and effective AI systems in fields ranging from medical imaging to climate modeling.
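To make the core notion concrete, a simple way to measure input sensitivity is to estimate the magnitude of each input's partial derivative. The sketch below uses central finite differences on a hypothetical scalar-output model; the `model` function and the epsilon value are illustrative assumptions, not drawn from any particular paper above.

```python
import numpy as np

def input_sensitivity(model, x, eps=1e-4):
    """Estimate per-input sensitivity of a scalar-output model via
    central finite differences: |df/dx_i| ~ |f(x+e) - f(x-e)| / (2*eps)."""
    x = np.asarray(x, dtype=float)
    grads = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps  # perturb only the i-th input
        grads[i] = (model(x + e) - model(x - e)) / (2 * eps)
    return np.abs(grads)

# Hypothetical model: output depends strongly on x[0], weakly on x[1].
model = lambda x: 10.0 * x[0] + 0.1 * x[1] ** 2
s = input_sensitivity(model, [1.0, 1.0])
# s[0] is much larger than s[1], flagging x[0] as the sensitive input.
```

Gradient-based penalties mentioned above work on the same quantity: during training, a regularizer shrinks these input gradients so the model's output varies less under small input perturbations.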