Information Geometry
Information geometry applies the tools of differential geometry to families of probability distributions, with the aim of understanding and optimizing machine learning and statistical inference. Current research leverages its core constructs, such as the Fisher information metric and Bregman divergences, to improve optimization algorithms (including Adam and natural gradient descent), to develop novel generative models, and to make neural networks more interpretable by analyzing how their parameters evolve during training. This framework offers a powerful mathematical lens for analyzing complex systems, leading to improved model efficiency and robustness and a deeper understanding of learning dynamics across diverse fields.
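To make the connection to optimization concrete, the following is a minimal, illustrative sketch (not drawn from any particular paper above) of natural gradient descent: fitting a one-dimensional Gaussian by preconditioning the ordinary gradient with the inverse Fisher information matrix. For a Gaussian parameterized as (mu, log_sigma), the Fisher matrix is diagonal, F = diag(1/sigma^2, 2), so the natural gradient step is cheap to compute exactly.

```python
import numpy as np

# Illustrative sketch: natural gradient descent for fitting N(mu, sigma^2)
# to data, parameterized as (mu, log_sigma). For this parameterization the
# Fisher information matrix is F = diag(1/sigma^2, 2), so the natural
# gradient F^{-1} * grad has a closed form.

rng = np.random.default_rng(0)
data = rng.normal(loc=3.0, scale=2.0, size=10_000)

mu, log_sigma = 0.0, 0.0  # initial parameters
lr = 0.5

for _ in range(100):
    sigma2 = np.exp(2 * log_sigma)
    # Gradients of the average negative log-likelihood
    g_mu = (mu - data.mean()) / sigma2
    g_log_sigma = 1.0 - ((data - mu) ** 2).mean() / sigma2
    # Precondition by the inverse Fisher matrix F = diag(1/sigma2, 2)
    mu -= lr * sigma2 * g_mu
    log_sigma -= lr * g_log_sigma / 2.0

print(mu, np.exp(log_sigma))  # approaches the sample mean and std
```

Note how the Fisher preconditioner makes the update for `mu` independent of the current `sigma`: in the natural geometry, a step of a given size changes the distribution by the same "amount" regardless of where in parameter space it is taken, which is the intuition behind natural gradient methods.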