Conditional Mutual Information
Conditional mutual information (CMI) quantifies the information shared between two variables given a third, making it a useful tool for analyzing dependencies in complex systems. Current research applies CMI to a range of problems: feature selection (often via greedy search algorithms or k-nearest-neighbor estimators), regularizing deep learning models to promote intra-class concentration and inter-class separation of learned representations, and deriving tighter generalization bounds for machine learning algorithms. These advances enable more efficient data acquisition, improved model interpretability, and a deeper theoretical understanding of generalization in machine learning.
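As a concrete illustration of the feature-selection use case mentioned above, the sketch below estimates I(X; Y | Z) = Σ_{x,y,z} p(x,y,z) log[ p(x,y|z) / (p(x|z) p(y|z)) ] with a simple plug-in (histogram) estimator for discrete data, and uses it inside a greedy forward-selection loop in the style of CMIM, where each candidate feature is scored by its minimum CMI with the label given any already-selected feature. This is a minimal sketch under those assumptions, not a method from any specific paper listed here; the function names (cmi, mi, greedy_cmi_selection) and the toy data are illustrative.

```python
"""Illustrative plug-in CMI estimator and CMIM-style greedy feature selection.

Assumptions: discrete (integer-valued) features and labels; names are hypothetical.
"""
from collections import Counter
from math import log

import numpy as np


def cmi(x, y, z):
    """Plug-in estimate of I(X; Y | Z) in nats for discrete 1-D arrays.

    I(X; Y | Z) = sum_{x,y,z} p(x,y,z) * log[ p(z) p(x,y,z) / (p(x,z) p(y,z)) ]
    """
    n = len(x)
    pxyz = Counter(zip(x, y, z))
    pxz = Counter(zip(x, z))
    pyz = Counter(zip(y, z))
    pz = Counter(z)
    total = 0.0
    for (xi, yi, zi), c in pxyz.items():
        p_xyz = c / n
        total += p_xyz * log(
            (pz[zi] / n) * p_xyz / ((pxz[(xi, zi)] / n) * (pyz[(yi, zi)] / n))
        )
    return total


def mi(x, y):
    """Unconditional mutual information as a special case (constant Z)."""
    return cmi(x, y, np.zeros(len(x), dtype=int))


def greedy_cmi_selection(X, y, k):
    """Select k feature indices, CMIM-style: each remaining candidate is scored
    by the minimum of I(X_f; Y | X_s) over already-selected features s, and the
    candidate with the largest score is added."""
    n_features = X.shape[1]
    selected = []
    while len(selected) < k:
        best_f, best_score = None, -np.inf
        for f in range(n_features):
            if f in selected:
                continue
            if not selected:
                score = mi(X[:, f], y)  # first pick: plain mutual information
            else:
                score = min(cmi(X[:, f], y, X[:, s]) for s in selected)
            if score > best_score:
                best_f, best_score = f, score
        selected.append(best_f)
    return selected


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.integers(0, 2, size=(500, 6))   # six binary features
    y = X[:, 0] + 2 * X[:, 3]               # label depends only on features 0 and 3
    print(greedy_cmi_selection(X, y, k=2))  # expected to recover features 0 and 3
```

On the toy data, only features 0 and 3 determine the label, so the greedy loop should recover them while ignoring the noise features; for continuous features, the k-nearest-neighbor estimators mentioned above would take the place of the plug-in counts.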