Mutual Information

Mutual information (MI), a measure of the statistical dependence between two random variables, is a cornerstone of information theory with broad applications in machine learning and beyond. Current research focuses on developing accurate MI estimators for high-dimensional data and on leveraging MI to improve machine learning tasks such as model training, feature selection, and representation learning. This work employs techniques such as normalizing flows, variational inference, and contrastive learning across diverse model architectures, including neural networks and diffusion models. Accurate estimation and effective application of MI are crucial for advancing fields ranging from causal inference and image processing to natural language processing and federated learning.
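For discrete variables, MI is defined as I(X; Y) = Σ_{x,y} p(x, y) log[ p(x, y) / (p(x) p(y)) ]. The sketch below (a minimal illustration using only NumPy; the function name `mutual_information` is an assumption, not from any particular paper) computes the plug-in estimate from empirical co-occurrence counts. This naive estimator is the baseline that the high-dimensional neural estimators mentioned above aim to improve on, since counting joint occurrences becomes infeasible for continuous or high-dimensional data.

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of I(X; Y) in nats for discrete samples."""
    x = np.asarray(x)
    y = np.asarray(y)
    # Build the empirical joint distribution from co-occurrence counts.
    _, x_idx = np.unique(x, return_inverse=True)
    _, y_idx = np.unique(y, return_inverse=True)
    joint = np.zeros((x_idx.max() + 1, y_idx.max() + 1))
    np.add.at(joint, (x_idx, y_idx), 1)
    joint /= joint.sum()
    # Marginals p(x) and p(y) as column/row vectors for broadcasting.
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    # Sum p(x,y) * log[p(x,y) / (p(x)p(y))] over cells with nonzero mass.
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log(joint[nz] / (px @ py)[nz])))

# Y is a deterministic function of a fair binary X, so I(X; Y) = H(X) = log 2.
print(mutual_information([0, 0, 1, 1], [0, 0, 1, 1]))  # ≈ 0.693
# Independent X and Y give zero mutual information.
print(mutual_information([0, 0, 1, 1], [0, 1, 0, 1]))  # 0.0
```

Like any plug-in estimator, this is biased upward for small samples, which is one motivation for the variational and contrastive bounds discussed above.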

Papers