Low Complexity
Low-complexity research focuses on developing efficient algorithms and models that achieve high performance while minimizing compute, memory usage, and energy consumption. Current efforts concentrate on designing lightweight neural architectures (e.g., MobileNets, EfficientNets, and spiking neural networks), applying compression techniques such as pruning, quantization, and knowledge distillation, and optimizing existing model families (e.g., transformers, random forests) for reduced complexity. This pursuit is crucial for deploying advanced machine learning models on resource-constrained hardware (e.g., edge and IoT devices) and for improving the scalability and efficiency of large-scale machine learning workloads.
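As a minimal illustration of one of these compression techniques, magnitude-based weight pruning zeroes out the smallest-magnitude weights of a layer to induce sparsity. The sketch below uses NumPy; the function name, the target sparsity level, and the random weight matrix are illustrative, not taken from any specific paper above.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Return a copy of `weights` with (at least) the `sparsity`
    fraction of smallest-magnitude entries set to exactly zero."""
    k = int(sparsity * weights.size)
    if k == 0:
        return weights.copy()
    # Threshold = k-th smallest absolute value across the whole tensor.
    flat = np.abs(weights).ravel()
    threshold = np.partition(flat, k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))      # toy "layer" weights
pw = magnitude_prune(w, 0.5)     # prune the smallest 50% by magnitude
print(np.mean(pw == 0.0))        # fraction of zeroed weights, >= 0.5
```

In practice, pruning is usually interleaved with fine-tuning so the remaining weights can compensate for the removed ones, and the resulting sparsity is exploited by sparse kernels or structured-pruning layouts to realize actual speedups.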