Core Stability
Core stability, broadly defined as the robustness and reliability of a system's or model's performance under varying conditions, is a central theme across diverse scientific fields. Current research focuses on improving stability in machine learning models (e.g., through regularization techniques, modified optimization algorithms such as AdamG, and analyses of Jacobian alignment), in inverse problems (using methods such as Wasserstein gradient flows and data-dependent regularization), and in dynamical systems (by mitigating data-induced instability). Understanding and enhancing core stability is crucial for building reliable and trustworthy systems across applications ranging from medical imaging and AI to robotics and control systems.
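As a concrete illustration of the regularization theme above, the minimal sketch below (not drawn from any specific paper listed here; the data, `ridge_fit`, and `perturbation_sensitivity` are illustrative assumptions) fits a toy linear model with and without ridge regularization and measures how much the predictions shift under small random input perturbations — a simple empirical proxy for stability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y depends on the first feature only;
# the remaining features are noise an unregularized fit can latch onto.
X = rng.normal(size=(200, 20))
y = X[:, 0] + 0.1 * rng.normal(size=200)

def ridge_fit(X, y, lam):
    # Closed-form ridge regression: w = (X^T X + lam * I)^{-1} X^T y.
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def perturbation_sensitivity(w, X, eps=0.01, trials=100):
    # Mean absolute change in predictions under small random input
    # perturbations -- a crude empirical stability measure.
    base = X @ w
    deltas = []
    for _ in range(trials):
        Xp = X + eps * rng.normal(size=X.shape)
        deltas.append(np.mean(np.abs(Xp @ w - base)))
    return float(np.mean(deltas))

w_plain = ridge_fit(X, y, lam=0.0)   # unregularized fit
w_reg = ridge_fit(X, y, lam=10.0)    # ridge-regularized fit

print("sensitivity (plain):", perturbation_sensitivity(w_plain, X))
print("sensitivity (ridge):", perturbation_sensitivity(w_reg, X))
```

Because the sensitivity scales with the weight norm, the ridge penalty, by shrinking the weights, directly reduces the model's response to input perturbations — one simple mechanism behind the stability gains that regularization methods aim for.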