External Sample
External sample research focuses on making the most of limited data in machine learning tasks, aiming to maintain or improve model performance with fewer training examples. Current work explores techniques such as sample weighting, strategic sampling methods (e.g., hard sample mining, or prioritizing easy samples), and novel algorithms for handling noisy or biased samples, often in the context of generative models, neural networks, and reinforcement learning. These advances matter because they address data scarcity and high computational cost, yielding more efficient and robust models across diverse applications.
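As a minimal sketch of two of the techniques named above, the snippet below shows hard sample mining (selecting the highest-loss examples for the next training step) and softmax-based sample weighting (giving harder examples proportionally larger weight). The function names, the `temperature` parameter, and the toy loss values are illustrative assumptions, not taken from any of the listed papers.

```python
import math

def hard_sample_indices(losses, k):
    # Hard sample mining: indices of the k highest-loss ("hardest") examples,
    # sorted from hardest to easiest.
    order = sorted(range(len(losses)), key=lambda i: losses[i], reverse=True)
    return order[:min(k, len(losses))]

def sample_weights(losses, temperature=1.0):
    # Softmax weighting over per-sample losses: harder samples get larger
    # weights; `temperature` (illustrative) controls how peaked the weights are.
    zmax = max(losses)  # subtract max for numerical stability
    w = [math.exp((l - zmax) / temperature) for l in losses]
    s = sum(w)
    return [x / s for x in w]

per_sample_losses = [0.1, 2.3, 0.5, 1.7]  # toy values
print(hard_sample_indices(per_sample_losses, 2))   # → [1, 3]
print(sample_weights(per_sample_losses))
```

In practice the mined indices or weights would feed back into the training loop, e.g. to build the next minibatch or to scale each sample's gradient contribution.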
Papers
Oops, I Sampled it Again: Reinterpreting Confidence Intervals in Few-Shot Learning
Raphael Lafargue, Luke Smith, Franck Vermet, Mathias Löwe, Ian Reid, Vincent Gripon, Jack Valmadre
Sample what you can't compress
Vighnesh Birodkar, Gabriel Barcik, James Lyon, Sergey Ioffe, David Minnen, Joshua V. Dillon