Information Geometry
Information geometry applies differential-geometric concepts to families of probability distributions in order to understand and optimize machine learning and statistical inference. Current research leverages its core tools, such as the Fisher information metric and Bregman divergences, to improve optimization algorithms (including Adam and natural gradient descent), develop novel generative models, and enhance the interpretability of neural networks by analyzing how their parameters evolve during training. This framework offers a rigorous mathematical lens for analyzing complex systems, yielding gains in model efficiency and robustness and a deeper understanding of learning dynamics across diverse fields.
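The two tools named above can be made concrete in a few lines. The sketch below (a minimal NumPy illustration; the generator functions and the Bernoulli example are hypothetical choices for exposition, not drawn from any of the listed papers) shows that the Bregman divergence induced by the negative-entropy generator recovers the KL divergence, and that preconditioning an ordinary gradient by the inverse Fisher information, i.e. one step of natural gradient descent, simplifies the update for a Bernoulli mean.

```python
import numpy as np

def bregman_divergence(F, grad_F, p, q):
    """General Bregman divergence D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>."""
    return F(p) - F(q) - np.dot(grad_F(q), p - q)

# Negative-entropy generator: on the probability simplex its Bregman
# divergence reduces to the Kullback-Leibler divergence.
neg_entropy = lambda x: float(np.sum(x * np.log(x)))
grad_neg_entropy = lambda x: np.log(x) + 1.0

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])
kl_via_bregman = bregman_divergence(neg_entropy, grad_neg_entropy, p, q)
kl_direct = float(np.sum(p * np.log(p / q)))

# Natural gradient descent on a Bernoulli mean theta, given empirical mean m.
# The Fisher information is I(theta) = 1 / (theta * (1 - theta)); dividing the
# ordinary gradient of the negative log-likelihood by I(theta) collapses the
# update to the simple contraction theta <- theta - lr * (theta - m).
def natural_gradient_step(theta, m, lr=0.5):
    grad = (theta - m) / (theta * (1.0 - theta))  # ordinary gradient
    fisher = 1.0 / (theta * (1.0 - theta))        # Fisher information
    return theta - lr * grad / fisher             # natural-gradient update

theta, m = 0.9, 0.3  # start far from the target mean m
for _ in range(50):
    theta = natural_gradient_step(theta, m)
```

The Bernoulli case makes the appeal of the natural gradient visible: the Fisher preconditioner exactly cancels the curvature of the log-likelihood, so the iteration converges at a rate independent of where theta sits in (0, 1), whereas the ordinary gradient step would blow up near the boundary.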