Resource-Constrained Devices

Resource-constrained devices, such as mobile phones and embedded systems, present unique challenges for deploying computationally intensive machine learning models. Current research focuses on optimizing model architectures (e.g., CNNs, Transformers, Binary Neural Networks) and on training and compression techniques (e.g., federated learning, knowledge distillation, quantization, pruning) to reduce memory footprint, power consumption, and inference latency while maintaining accuracy. This work is crucial for enabling on-device AI in fields such as healthcare, agriculture, and mobile computing, where running models directly on resource-limited hardware is essential for privacy, efficiency, and real-time responsiveness.
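
To make one of these techniques concrete, the sketch below applies post-training dynamic quantization to a small, purely illustrative classifier using PyTorch's `torch.quantization.quantize_dynamic` utility; the model architecture and sizes are assumptions for demonstration, not taken from any specific paper listed here. Dynamic quantization stores the weights of selected layers in int8 and quantizes activations on the fly at inference time, which is one common way to shrink memory footprint and speed up CPU inference on constrained hardware.

```python
# Minimal sketch: post-training dynamic quantization with PyTorch.
# The model and layer sizes below are illustrative assumptions only.
import torch
import torch.nn as nn

# Hypothetical small model standing in for an on-device classifier.
model = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
)
model.eval()

# Convert the weights of all Linear layers to int8; activations are
# quantized dynamically during inference.
quantized_model = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Rough storage estimate for the original float32 parameters.
fp32_bytes = sum(p.numel() * p.element_size() for p in model.parameters())
print(f"fp32 parameter storage: {fp32_bytes} bytes")

# The quantized model is a drop-in replacement at inference time.
x = torch.randn(1, 128)
print("quantized model output shape:", quantized_model(x).shape)
```

Dynamic quantization is only one point in the design space surveyed here; static quantization, pruning, and knowledge distillation trade off accuracy, latency, and memory differently depending on the target hardware.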

Papers