Resource-Constrained Devices
Resource-constrained devices, such as mobile phones and embedded systems, present unique challenges for deploying computationally intensive machine learning models. Current research focuses on efficient model architectures (e.g., CNNs, Transformers, binary neural networks) and on compression and training techniques (e.g., federated learning, knowledge distillation, quantization, pruning) that reduce memory footprint, power consumption, and inference latency while preserving accuracy. This work is crucial for enabling on-device AI in fields such as healthcare, agriculture, and mobile computing, where running models directly on resource-limited hardware is essential for privacy, efficiency, and real-time responsiveness.
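As an illustration of one of these techniques, the sketch below applies post-training dynamic quantization in PyTorch, converting the weights of a small feed-forward model to int8 to shrink its serialized size for on-device deployment. The toy model architecture and the size-measurement helper are assumptions made for this example, not taken from any particular paper in this area.

```python
# Illustrative sketch (assumptions: PyTorch as the framework; the toy
# feed-forward model and the size-measurement helper are hypothetical).
import os

import torch
import torch.nn as nn


def model_size_mb(model: nn.Module, path: str = "tmp_model.pt") -> float:
    """Serialize the model's state_dict to disk and report its size in MB."""
    torch.save(model.state_dict(), path)
    size_mb = os.path.getsize(path) / 1e6
    os.remove(path)
    return size_mb


# A small feed-forward classifier standing in for an on-device model.
model = nn.Sequential(
    nn.Linear(128, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, 10),
)

# Post-training dynamic quantization: Linear weights are stored as int8 and
# activations are quantized on the fly at inference time; no retraining needed.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

print(f"fp32 size: {model_size_mb(model):.2f} MB")
print(f"int8 size: {model_size_mb(quantized):.2f} MB")

# Outputs stay close to the fp32 model, at a fraction of the memory footprint.
x = torch.randn(1, 128)
print("max abs diff:", (model(x) - quantized(x)).abs().max().item())
```

Dynamic quantization requires no retraining and targets weight-heavy layers such as Linear and LSTM; when activations must also be quantized ahead of time, static quantization or quantization-aware training typically recovers more accuracy at the cost of a calibration or fine-tuning step.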