Lazy Learning
Lazy learning is a machine learning paradigm in which model parameters are updated only when necessary, in contrast to traditional methods that update on every data point. Current research explores its application across diverse areas, including online learning under privacy constraints, neural network training inspired by biological learning mechanisms, and analysis of large language models to understand shortcut learning. The approach offers notable advantages in computational efficiency and energy consumption, and can improve performance in specific contexts, particularly with large datasets or in resource-limited environments.
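The "update only when necessary" idea can be illustrated with a minimal sketch: an online linear classifier that modifies its weights only when an incoming point is misclassified (or falls within a margin), and otherwise leaves the parameters untouched. This is a hypothetical illustration of the paradigm described above, not an implementation from any of the surveyed papers; the function name `lazy_perceptron` and all parameters are assumptions for the example.

```python
import random

def lazy_perceptron(stream, lr=0.1, margin=0.0):
    """Online linear classifier that updates lazily: weights change
    only when a point is misclassified or within the margin."""
    w = [0.0, 0.0]
    b = 0.0
    updates = 0
    for x, y in stream:  # y in {-1, +1}
        score = w[0] * x[0] + w[1] * x[1] + b
        if y * score <= margin:      # update only when necessary
            w[0] += lr * y * x[0]
            w[1] += lr * y * x[1]
            b += lr * y
            updates += 1
        # otherwise: lazy -- skip the update entirely
    return w, b, updates

# Toy linearly separable stream: label = sign(x0 + x1)
random.seed(0)
data = []
for _ in range(200):
    x = (random.uniform(-1, 1), random.uniform(-1, 1))
    y = 1 if x[0] + x[1] > 0 else -1
    data.append((x, y))

w, b, updates = lazy_perceptron(data)
print(updates, len(data))  # typically far fewer updates than data points
```

Because most points eventually satisfy the margin condition, the per-point update cost drops to a comparison, which is the computational-efficiency argument made above.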