Kernel Ridge Regression
Kernel ridge regression (KRR) is a powerful non-parametric regression technique that learns complex nonlinear relationships in data by minimizing a regularized empirical risk over a reproducing kernel Hilbert space. Current research focuses on improving KRR's scalability and efficiency for large datasets, including distributed algorithms and low-rank approximations, as well as on challenges such as parameter selection and covariate shift. These advances matter for diverse applications, from genome-wide association studies and computational chemistry to meta-analysis and time series forecasting, enabling more accurate and efficient analyses of high-dimensional data.
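To make the regularized empirical risk minimization concrete, here is a minimal sketch of KRR with an RBF kernel using only NumPy. By the representer theorem, the minimizer is a kernel expansion over the training points, and the coefficients solve the linear system (K + λnI)α = y. The function names (`rbf_kernel`, `krr_fit`, `krr_predict`) and the specific hyperparameter values are illustrative choices, not from the papers listed below.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared Euclidean distances
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def krr_fit(X, y, lam=1e-3, gamma=1.0):
    # Minimize (1/n) * sum_i (f(x_i) - y_i)^2 + lam * ||f||^2 in the RKHS:
    # the solution is f(x) = sum_i alpha_i k(x, x_i), with
    # alpha solving (K + lam * n * I) alpha = y.
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def krr_predict(X_train, alpha, X_new, gamma=1.0):
    # Evaluate the kernel expansion at new inputs
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Fit a noisy sine curve as a toy example
rng = np.random.default_rng(0)
X = rng.uniform(0, 2 * np.pi, (50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
alpha = krr_fit(X, y, lam=1e-3, gamma=1.0)
X_test = np.linspace(0, 2 * np.pi, 5)[:, None]
pred = krr_predict(X, alpha, X_test, gamma=1.0)
```

This closed-form solve is O(n³) in the number of training points, which is precisely the bottleneck that the sparse-approximation and distributed methods studied in the papers below aim to overcome.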
Papers
Improved Convergence Rates for Sparse Approximation Methods in Kernel-Based Learning
Sattar Vakili, Jonathan Scarlett, Da-shan Shiu, Alberto Bernacchia
Distribution Regression with Sliced Wasserstein Kernels
Dimitri Meunier, Massimiliano Pontil, Carlo Ciliberto
Learning to Predict Graphs with Fused Gromov-Wasserstein Barycenters
Luc Brogat-Motte, Rémi Flamary, Céline Brouard, Juho Rousu, Florence d'Alché-Buc