Nonparametric Regression
Nonparametric regression aims to estimate an unknown function from data without strong assumptions about its functional form, with the goals of achieving optimal prediction accuracy and understanding the underlying function's properties. Current research emphasizes efficient algorithms based on kernel methods, neural networks (especially deep and shallow ReLU networks), and tensor neural networks, often combined with regularization, adversarial training, or active learning to improve performance and generalization. These advances enable more accurate modeling of complex relationships in data and support robust statistical inference in settings where parametric assumptions are unrealistic or inappropriate.
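As a concrete illustration of the kernel-based estimators mentioned above, the sketch below implements a Nadaraya-Watson kernel smoother in NumPy. The function name, bandwidth value, and synthetic data are illustrative assumptions for this example only and are not drawn from the papers listed here.

```python
# Minimal sketch of nonparametric regression via a Nadaraya-Watson kernel
# estimator. Bandwidth and data below are illustrative choices (assumptions).
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, bandwidth=0.2):
    """Predict y at each query point as a kernel-weighted average of training targets."""
    # Pairwise differences between query and training points: shape (n_query, n_train).
    diffs = x_query[:, None] - x_train[None, :]
    # Unnormalized Gaussian kernel weights.
    weights = np.exp(-0.5 * (diffs / bandwidth) ** 2)
    # Normalize each row so the weights for a query point sum to one.
    weights /= weights.sum(axis=1, keepdims=True)
    # Weighted average of the training responses gives the estimate.
    return weights @ y_train

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.sort(rng.uniform(0, 1, 200))
    # Noisy observations of a nonlinear function with no parametric model assumed.
    y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=x.shape)
    x_grid = np.linspace(0, 1, 50)
    y_hat = nadaraya_watson(x, y, x_grid, bandwidth=0.1)
    print(y_hat[:5])  # smoothed estimates near x = 0
```

The bandwidth controls the bias-variance trade-off: smaller values track the data more closely but fit more noise, while larger values give smoother, more biased estimates.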
Papers
Stable Minima Cannot Overfit in Univariate ReLU Networks: Generalization by Large Step Sizes
Dan Qiao, Kaiqi Zhang, Esha Singh, Daniel Soudry, Yu-Xiang Wang
Optimal Federated Learning for Nonparametric Regression with Heterogeneous Distributed Differential Privacy Constraints
T. Tony Cai, Abhinav Chakraborty, Lasse Vuursteen