Parameterized Model
Parameterized models, particularly overparameterized ones with more parameters than training examples, are a central focus of modern machine learning research, which aims to understand their surprising ability to generalize despite appearing to overfit. Current work investigates how model size, data characteristics, and training algorithms (such as stochastic gradient descent and its accelerated and adaptive variants) affect performance, with emphasis on architectures such as deep neural networks and kernel regression. These insights matter for improving the efficiency and robustness of machine learning models across diverse applications, from renewable energy prediction to robust classification in noisy environments.
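The central phenomenon described above, a model with more parameters than data points that fits the training set exactly yet still carries predictive signal, can be sketched in a few lines. The following is a minimal illustrative example (the toy data and all names are assumptions, not drawn from the listed papers): in overparameterized least squares, the pseudoinverse yields the minimum-norm interpolating solution, which is also the implicit bias of gradient descent started from zero.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy overparameterized setup: n = 20 samples in d = 100 dimensions (d > n).
# Labels come from a sparse signal direction plus a little noise.
n, d = 20, 100
w_true = np.zeros(d)
w_true[:5] = 1.0
X_train = rng.normal(size=(n, d))
y_train = X_train @ w_true + 0.1 * rng.normal(size=n)
X_test = rng.normal(size=(1000, d))
y_test = X_test @ w_true

# Minimum-norm interpolating solution: among all w with X_train @ w = y_train,
# the pseudoinverse picks the one with smallest Euclidean norm.
w_hat = np.linalg.pinv(X_train) @ y_train

train_mse = np.mean((X_train @ w_hat - y_train) ** 2)
test_mse = np.mean((X_test @ w_hat - y_test) ** 2)
null_mse = np.mean(y_test ** 2)  # error of the trivial zero predictor

print(f"train MSE: {train_mse:.2e}")  # essentially zero: the model interpolates
print(f"test  MSE: {test_mse:.3f} (null baseline: {null_mse:.3f})")
```

Despite interpolating noisy training labels, the minimum-norm solution beats the null predictor on fresh test points; how far this "benign overfitting" goes depends on the data covariance and the model, which is exactly the question the research above studies.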
Papers
UMIX: Improving Importance Weighting for Subpopulation Shift via Uncertainty-Aware Mixup
Zongbo Han, Zhipeng Liang, Fan Yang, Liu Liu, Lanqing Li, Yatao Bian, Peilin Zhao, Bingzhe Wu, Changqing Zhang, Jianhua Yao
Importance Tempering: Group Robustness for Overparameterized Models
Yiping Lu, Wenlong Ji, Zachary Izzo, Lexing Ying