Low Complexity
Low complexity research focuses on developing efficient algorithms and models that achieve high performance while minimizing computational resources, memory usage, and energy consumption. Current efforts concentrate on designing lightweight neural networks (e.g., using MobileNets, EfficientNets, and spiking neural networks), employing techniques like pruning, quantization, and knowledge distillation, and optimizing existing architectures (e.g., transformers, random forests) for reduced complexity. This pursuit is crucial for deploying advanced machine learning models on resource-constrained devices (e.g., edge devices, IoT) and for improving the scalability and efficiency of large-scale machine learning tasks.
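To make the techniques named above concrete, below is a minimal sketch (assuming PyTorch is available) of two of them: magnitude pruning followed by post-training dynamic quantization. The toy fully connected model and the 50% sparsity target are illustrative assumptions, not values taken from any particular paper.

import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Toy fully connected network standing in for a larger model.
model = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
)

# 1) Unstructured magnitude pruning: zero out the 50% of weights
#    with the smallest absolute value in each Linear layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)
        prune.remove(module, "weight")  # make the sparsity permanent

# 2) Post-training dynamic quantization: store Linear weights in int8
#    and quantize activations on the fly, cutting memory use and often
#    speeding up CPU inference.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Quick check that the compressed model still produces outputs.
x = torch.randn(1, 128)
print(quantized(x).shape)  # torch.Size([1, 10])

In practice these steps are usually paired with fine-tuning or distillation to recover any accuracy lost during compression.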