Core Stability
Core stability, broadly defined as the robustness and reliability of a system's or model's behavior under varying conditions, is a central theme across diverse scientific fields. Current research focuses on improving stability in machine learning models (e.g., through regularization techniques, modified optimization algorithms such as AdamG, and analyses of Jacobian alignment), in inverse problems (via Wasserstein gradient flows and data-dependent regularization), and in dynamical systems (by mitigating data-induced instability). Understanding and enhancing core stability is essential for building reliable, trustworthy systems in applications ranging from medical imaging and AI to robotics and control.
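To make one common operational reading of stability concrete, the sketch below (an illustrative example, not drawn from any of the listed papers; the synthetic data and the `ridge_fit` and `prediction_instability` helpers are hypothetical) shows how a simple L2 regularization term can reduce a linear model's sensitivity to small input perturbations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ill-conditioned regression problem: a nearly collinear feature
# makes the unregularized solution highly sensitive to perturbations.
n, d = 50, 10
X = rng.normal(size=(n, d))
X[:, 1] = X[:, 0] + 1e-3 * rng.normal(size=n)  # near-duplicate column
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)


def ridge_fit(X, y, lam):
    """Closed-form ridge regression: solve (X^T X + lam * I) w = X^T y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)


def prediction_instability(w, X, noise_scale=0.01, trials=100):
    """Mean absolute change in predictions under small random input perturbations."""
    base = X @ w
    diffs = []
    for _ in range(trials):
        X_pert = X + noise_scale * rng.normal(size=X.shape)
        diffs.append(np.mean(np.abs(X_pert @ w - base)))
    return float(np.mean(diffs))


for lam in [0.0, 1.0]:
    w = ridge_fit(X, y, lam)
    print(f"lambda={lam:4.1f}  ||w||={np.linalg.norm(w):8.2f}  "
          f"instability={prediction_instability(w, X):.4f}")
```

The regularized fit shrinks the weight norm, and since the prediction change under input noise scales with that norm, the measured instability drops accordingly; the same intuition underlies stability-oriented regularization in larger models.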