Device Continual Learning
Device continual learning focuses on enabling edge devices to learn continuously from streaming data without catastrophic forgetting, while respecting tight resource constraints such as limited memory and processing power. Current research emphasizes efficient algorithms and model architectures, including binary neural networks, vision transformers, and hyperdimensional computing, often combined with techniques like prompt tuning, memory replay, and quantization to minimize resource usage. This field is crucial for developing autonomous, privacy-preserving AI systems in resource-constrained environments, with applications ranging from IoT devices to mobile phones.
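One of the techniques mentioned above, memory replay, can be sketched in a few lines. A common resource-bounded variant uses reservoir sampling so that a fixed-size buffer holds a uniform sample of the entire stream, regardless of stream length. The sketch below is illustrative, not drawn from any specific paper; the class and parameter names (`ReservoirReplayBuffer`, `capacity`) are assumptions for the example.

```python
import random


class ReservoirReplayBuffer:
    """Fixed-size replay buffer for on-device continual learning.

    Uses reservoir sampling: after n items have streamed past, every
    item has equal probability capacity/n of being in the buffer, so
    memory stays bounded no matter how long the stream runs.
    """

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []          # stored (input, label) examples
        self.n_seen = 0           # total items observed in the stream
        self.rng = random.Random(seed)

    def add(self, example):
        self.n_seen += 1
        if len(self.buffer) < self.capacity:
            # Buffer not full yet: always keep the example.
            self.buffer.append(example)
        else:
            # Keep the new example with probability capacity / n_seen,
            # evicting a uniformly chosen old one.
            j = self.rng.randrange(self.n_seen)
            if j < self.capacity:
                self.buffer[j] = example

    def sample(self, k):
        """Draw a replay mini-batch to mix with new data during training."""
        return self.rng.sample(self.buffer, min(k, len(self.buffer)))


# Usage: stream 1,000 examples through a 50-slot buffer, then draw a
# replay batch that would be interleaved with fresh data at train time.
buf = ReservoirReplayBuffer(capacity=50)
for i in range(1000):
    buf.add((f"x_{i}", i % 10))   # (input, label) placeholder pairs
replay_batch = buf.sample(16)
```

Mixing `replay_batch` with each incoming batch approximates joint training over old and new data, which is what mitigates catastrophic forgetting at a memory cost fixed by `capacity`.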