On-Device Use Case
On-device machine learning focuses on deploying models directly on resource-constrained devices such as smartphones and wearables, prioritizing privacy, low latency, and reduced energy consumption. Current research emphasizes lightweight model architectures (e.g., compact CNNs and optimized Transformers) and efficient training and inference algorithms, including derivative-free optimization techniques, for tasks such as speech recognition, image processing, and personalized language modeling. The field is significant because it enables new applications in healthcare (e.g., sepsis detection), personalized assistance, and real-time translation, while reducing data-privacy risks and dependence on the cloud.
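To give a concrete sense of what "lightweight architecture" means in practice, below is a minimal, illustrative sketch of a MobileNet-style classifier built from depthwise-separable convolutions in PyTorch. It is not drawn from any specific paper in this collection; the class names (`DepthwiseSeparableConv`, `TinyImageClassifier`) and the layer sizes are illustrative assumptions chosen to keep the parameter count in the few-thousand range typical of on-device models.

```python
import torch
import torch.nn as nn


class DepthwiseSeparableConv(nn.Module):
    """A depthwise 3x3 convolution followed by a pointwise (1x1) convolution.

    This factorization is a common way to cut parameters and multiply-adds
    in lightweight on-device CNNs (e.g., MobileNet-style architectures).
    """

    def __init__(self, in_channels: int, out_channels: int, stride: int = 1):
        super().__init__()
        self.depthwise = nn.Conv2d(
            in_channels, in_channels, kernel_size=3, stride=stride,
            padding=1, groups=in_channels, bias=False,
        )
        self.pointwise = nn.Conv2d(in_channels, out_channels, kernel_size=1, bias=False)
        self.bn = nn.BatchNorm2d(out_channels)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.bn(self.pointwise(self.depthwise(x))))


class TinyImageClassifier(nn.Module):
    """A small image classifier illustrating the scale of on-device models."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1, bias=False),
            nn.BatchNorm2d(16),
            nn.ReLU(inplace=True),
            DepthwiseSeparableConv(16, 32, stride=2),
            DepthwiseSeparableConv(32, 64, stride=2),
            nn.AdaptiveAvgPool2d(1),  # global average pooling keeps the head tiny
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x).flatten(1)
        return self.classifier(x)


if __name__ == "__main__":
    model = TinyImageClassifier(num_classes=10)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"parameters: {n_params}")           # a few thousand parameters
    logits = model(torch.randn(1, 3, 96, 96))  # single 96x96 RGB image
    print(logits.shape)                        # torch.Size([1, 10])
```

In a real deployment such a model would typically be further compressed (quantization, pruning) and exported to a mobile runtime; the sketch only shows the architectural idea of trading a full convolution for a depthwise/pointwise pair.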