Information-Theoretic Approach
Information-theoretic approaches are increasingly used to analyze and improve machine learning algorithms by quantifying the flow of information between data and model predictions, with the goal of enhancing generalization and efficiency. Current research emphasizes three directions: deriving tighter generalization bounds from information measures such as mutual information and relative entropy; applying these principles to areas including neural network pruning, clustering, and transfer learning; and designing algorithms that minimize information leakage for privacy preservation. Together, these tools help in understanding model behavior, improving performance, and ensuring robustness across a variety of machine learning tasks and applications.
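The mutual information mentioned above, I(X;Y), measures how many bits of information one variable carries about another, and is the basic quantity behind the generalization bounds this line of work studies. As a rough illustration (not any specific paper's method), here is a minimal plug-in estimator of mutual information between two discrete sample sequences, using only the Python standard library:

```python
from collections import Counter
import math

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired discrete samples.

    I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) * p(y)) ),
    with all probabilities replaced by empirical frequencies.
    """
    n = len(xs)
    pxy = Counter(zip(xs, ys))  # joint counts
    px = Counter(xs)            # marginal counts for X
    py = Counter(ys)            # marginal counts for Y
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p(x,y) / (p(x) p(y)) = (c/n) / ((px/n)(py/n)) = c*n / (px*py)
        mi += p_joint * math.log2(c * n / (px[x] * py[y]))
    return mi

# A fair binary X that fully determines Y gives I(X;Y) = H(X) = 1 bit:
print(mutual_information([0, 1, 0, 1], [0, 1, 0, 1]))  # 1.0
# Independent variables give I(X;Y) = 0:
print(mutual_information([0, 0, 1, 1], [0, 1, 0, 1]))  # 0.0
```

Intuitively, a predictor whose outputs share little mutual information with the training data beyond what the task requires tends to generalize better, which is what the information-theoretic generalization bounds formalize.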