Rate Reduction
Rate reduction, a core concept in information theory and machine learning, aims to minimize the size of a data representation while preserving its essential information. Current research applies rate reduction techniques to improve the efficiency and performance of various models, including spiking neural networks, deep metric learning architectures, and transformers, often via methods such as maximal coding rate reduction (MCR²) and sparse rate reduction. These advances matter because they address computational bottlenecks in large-scale models, enhance model robustness, and enable efficient deployment in resource-constrained environments such as mobile robotics and federated learning applications.
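To make the MCR² objective mentioned above concrete, the sketch below computes it with NumPy. For a feature matrix Z of shape d×n, the coding rate is R(Z) = ½ logdet(I + d/(nε²)·ZZᵀ), and MCR² maximizes ΔR = R(Z) − R_c(Z), where R_c averages per-class coding rates weighted by class size. The function names, the choice of ε, and the random test data are illustrative assumptions, not from any particular paper's codebase.

```python
import numpy as np

def coding_rate(Z, eps=0.5):
    """Coding rate R(Z) = 1/2 * logdet(I + d/(n*eps^2) * Z Z^T).

    Z: (d, n) matrix of n feature vectors in R^d.
    """
    d, n = Z.shape
    # slogdet is numerically stabler than log(det(...))
    return 0.5 * np.linalg.slogdet(np.eye(d) + (d / (n * eps**2)) * Z @ Z.T)[1]

def coding_rate_per_class(Z, labels, eps=0.5):
    """Class-conditional rate R_c: per-class rates weighted by class size."""
    d, n = Z.shape
    rate = 0.0
    for c in np.unique(labels):
        Zc = Z[:, labels == c]
        nc = Zc.shape[1]
        rate += (nc / (2 * n)) * np.linalg.slogdet(
            np.eye(d) + (d / (nc * eps**2)) * Zc @ Zc.T
        )[1]
    return rate

def mcr2(Z, labels, eps=0.5):
    """MCR^2 objective: Delta R = R(Z) - R_c(Z); always >= 0 by
    concavity of logdet, and large when classes occupy distinct subspaces."""
    return coding_rate(Z, eps) - coding_rate_per_class(Z, labels, eps)
```

In training, ΔR is maximized over the representation (e.g. the features produced by a network), pushing samples from different classes toward different subspaces while keeping the overall representation volume large.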
Papers
- September 4, 2024
- July 3, 2024
- June 10, 2024
- June 4, 2024
- February 27, 2024
- November 22, 2023
- June 1, 2023
- April 25, 2023
- March 15, 2023
- February 21, 2023
- November 14, 2022
- October 1, 2022
- August 27, 2022
- March 31, 2022
- January 31, 2022