Conditional Mutual Information
Conditional mutual information (CMI) quantifies the information shared between two variables given a third, formally I(X; Y | Z) = H(X | Z) - H(X | Y, Z), and offers a powerful tool for analyzing dependencies in complex systems. Current research applies CMI to diverse problems: feature selection (often via greedy search or k-nearest-neighbor estimators), improving deep learning models by constraining intra-class concentration and inter-class separation, and deriving tighter generalization bounds for machine learning algorithms. These advances enable more efficient data acquisition, improved model interpretability, and a deeper theoretical understanding of generalization in machine learning.
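To make the quantity concrete, the following is a minimal sketch of a plug-in CMI estimator for discrete variables, using the identity I(X; Y | Z) = Σ p(x,y,z) · log[ p(z) p(x,y,z) / (p(x,z) p(y,z)) ]. The function name and use of empirical frequencies are illustrative choices, not taken from any specific paper; production estimators for continuous data typically use the k-nearest-neighbor methods mentioned above.

```python
import numpy as np
from collections import Counter

def conditional_mutual_information(x, y, z):
    """Plug-in estimate of I(X; Y | Z) in nats from paired samples
    of discrete variables x, y, z (equal-length sequences)."""
    n = len(x)
    # Empirical joint and marginal counts.
    c_xyz = Counter(zip(x, y, z))
    c_xz = Counter(zip(x, z))
    c_yz = Counter(zip(y, z))
    c_z = Counter(z)
    cmi = 0.0
    for (xi, yi, zi), c in c_xyz.items():
        p_xyz = c / n
        # log [ p(z) p(x,y,z) / (p(x,z) p(y,z)) ], all as empirical frequencies
        cmi += p_xyz * np.log(
            (c_z[zi] / n) * p_xyz
            / ((c_xz[(xi, zi)] / n) * (c_yz[(yi, zi)] / n))
        )
    return cmi

rng = np.random.default_rng(0)
z = rng.integers(0, 2, 10_000)
x = rng.integers(0, 2, 10_000)
y = x.copy()  # X and Y identical, independent of Z: CMI ≈ H(X) ≈ log 2
print(conditional_mutual_information(x, y, z))
print(conditional_mutual_information(z, z, z))  # fully explained by Z: CMI = 0
```

Note that the plug-in estimator is biased upward for small samples, which is one reason the literature above develops alternative estimators.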