Resource-Constrained Devices
Resource-constrained devices, such as mobile phones and embedded systems, present unique challenges for deploying computationally intensive machine learning models. Current research focuses on efficient model architectures (e.g., CNNs, Transformers, Binary Neural Networks) and on training and compression techniques (e.g., federated learning, knowledge distillation, quantization, pruning) that reduce memory footprint, power consumption, and inference latency while preserving accuracy. This work is crucial for enabling on-device AI in fields such as healthcare, agriculture, and mobile computing, where running models directly on resource-limited hardware is essential for privacy, efficiency, and real-time responsiveness.
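As a concrete illustration of one of these compression techniques, the sketch below applies post-training dynamic quantization with PyTorch's torch.quantization.quantize_dynamic. The toy network and layer sizes are placeholders chosen for illustration, not drawn from any particular paper.

```python
import io
import torch
import torch.nn as nn

# Small example network standing in for an on-device model (sizes are illustrative).
model = nn.Sequential(
    nn.Linear(128, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
).eval()

# Post-training dynamic quantization: Linear weights are stored as int8,
# and activations are quantized on the fly at inference time.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

def serialized_size(m: nn.Module) -> int:
    """Size of the serialized state_dict in bytes."""
    buf = io.BytesIO()
    torch.save(m.state_dict(), buf)
    return buf.getbuffer().nbytes

x = torch.randn(1, 128)
with torch.no_grad():
    out = quantized(x)  # outputs remain float32

print(f"fp32 model: {serialized_size(model)} bytes")
print(f"int8 model: {serialized_size(quantized)} bytes")
print(f"output shape: {tuple(out.shape)}")
```

Dynamic quantization is only one point in the design space: quantization-aware training, pruning, and knowledge distillation trade additional training effort for further reductions in model size and latency.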