Training Data
Training data is central to machine learning model development, and current research focuses on improving data quality and efficiency and on mitigating bias. Active areas include generating synthetic data to address scarcity or privacy concerns; developing algorithms that optimize data selection and usage, such as self-paced learning and active learning; and countering problems like data contamination and class imbalance through techniques such as data augmentation, selective parameter merging, and novel loss functions. The quality and characteristics of training data strongly affect model performance, generalization, and robustness, with consequences for applications ranging from natural language processing and image recognition to scientific computing and medical diagnosis.
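To make the data-selection idea concrete, below is a minimal sketch of pool-based active learning with uncertainty sampling: the model queries the unlabeled point it is least certain about, so labeling effort goes where it helps most. The toy centroid "model" and all function names here are illustrative assumptions, not taken from any paper listed below.

```python
# Minimal sketch of pool-based active learning with uncertainty sampling.
# The centroid classifier and all names here are illustrative assumptions.

def fit_centroids(labeled):
    """Per-class mean of 1-D points: [(x, label), ...] -> {label: mean}."""
    sums, counts = {}, {}
    for x, y in labeled:
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def uncertainty(x, centroids):
    """Margin between the two nearest class centroids; smaller = less certain."""
    d = sorted(abs(x - c) for c in centroids.values())
    return d[1] - d[0] if len(d) > 1 else d[0]

def select_query(pool, centroids):
    """Pick the unlabeled point the current model is least certain about."""
    return min(pool, key=lambda x: uncertainty(x, centroids))

labeled = [(-2.0, 0), (2.0, 1)]        # seed set: one labeled point per class
pool = [-1.5, -0.3, 0.1, 0.4, 1.8]     # unlabeled candidate pool

centroids = fit_centroids(labeled)
query = select_query(pool, centroids)  # the point nearest the decision boundary
print(query)
```

Here the point 0.1 sits almost exactly between the two class centroids, so it is selected for labeling first; a random-sampling baseline would waste budget on easy points far from the boundary.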
Papers
Zero-shot meta-learning for small-scale data from human subjects
Julie Jiang, Kristina Lerman, Emilio Ferrara
Training Compute-Optimal Large Language Models
Jordan Hoffmann, Sebastian Borgeaud, Arthur Mensch, Elena Buchatskaya, Trevor Cai, Eliza Rutherford, Diego de Las Casas, Lisa Anne Hendricks, Johannes Welbl, Aidan Clark, Tom Hennigan, Eric Noland, Katie Millican, George van den Driessche, Bogdan Damoc, Aurelia Guy, Simon Osindero, Karen Simonyan, Erich Elsen, Jack W. Rae, Oriol Vinyals, Laurent Sifre
On the Role of Fixed Points of Dynamical Systems in Training Physics-Informed Neural Networks
Franz M. Rohrhofer, Stefan Posch, Clemens Gößnitzer, Bernhard C. Geiger
WaveFuzz: A Clean-Label Poisoning Attack to Protect Your Voice
Yunjie Ge, Qian Wang, Jingfeng Zhang, Juntao Zhou, Yunzhu Zhang, Chao Shen
Training Data is More Valuable than You Think: A Simple and Effective Method by Retrieving from Training Data
Shuohang Wang, Yichong Xu, Yuwei Fang, Yang Liu, Siqi Sun, Ruochen Xu, Chenguang Zhu, Michael Zeng
Open Set Recognition using Vision Transformer with an Additional Detection Head
Feiyang Cai, Zhenkai Zhang, Jie Liu, Xenofon Koutsoukos
Bridging the Data Gap between Training and Inference for Unsupervised Neural Machine Translation
Zhiwei He, Xing Wang, Rui Wang, Shuming Shi, Zhaopeng Tu