Sparse Variational Gaussian Process
Sparse Variational Gaussian Processes (SVGPs) are computationally efficient approximations of Gaussian Processes, designed to scale to large datasets by summarizing the data through a small set of learned pseudo-inputs (inducing points). Current research emphasizes improving the efficiency and accuracy of SVGP inference, exploring techniques such as natural gradient methods, contaminated normal likelihoods for robust outlier handling, and novel kernel designs, including kernels that incorporate self-attention mechanisms or B-spline basis functions. These advances enable SVGPs to tackle complex real-world problems, including time series forecasting (e.g., geomagnetic perturbations), voice conversion with limited data, and high-dimensional variable selection, making them a powerful alternative to both exact Gaussian Processes and deep learning methods across a range of applications.
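As a minimal sketch of the inducing-point idea, the collapsed variational lower bound of Titsias (2009), which underlies SVGPs with a Gaussian likelihood, can be written in a few lines of NumPy. The kernel, hyperparameter values, and function names below are illustrative assumptions, not taken from any specific paper discussed above:

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel between inputs A (n, d) and B (m, d).
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def sparse_variational_bound(X, y, Z, noise=0.1, lengthscale=1.0, variance=1.0):
    """Collapsed sparse-variational lower bound on log p(y) (Titsias, 2009).

    Z holds the inducing inputs; the bound costs O(n m^2) instead of the
    O(n^3) of exact GP inference, where m = len(Z) << n = len(X).
    """
    n, m = X.shape[0], Z.shape[0]
    Knn_diag = variance * np.ones(n)  # diagonal of K(X, X) for an RBF kernel
    Kmm = rbf_kernel(Z, Z, lengthscale, variance) + 1e-8 * np.eye(m)  # jitter
    Kmn = rbf_kernel(Z, X, lengthscale, variance)
    L = np.linalg.cholesky(Kmm)
    A = np.linalg.solve(L, Kmn)       # so that Qnn = A.T @ A = Knm Kmm^-1 Kmn
    Qnn = A.T @ A                     # the Nystrom (low-rank) approximation of Knn
    cov = Qnn + noise * np.eye(n)
    _, logdet = np.linalg.slogdet(cov)
    quad = y @ np.linalg.solve(cov, y)
    log_gauss = -0.5 * (n * np.log(2 * np.pi) + logdet + quad)
    # Trace correction penalizes variance the inducing points fail to capture.
    trace_term = 0.5 * (Knn_diag.sum() - np.trace(Qnn)) / noise
    return log_gauss - trace_term

def exact_log_marginal(X, y, noise=0.1, lengthscale=1.0, variance=1.0):
    # Exact GP log marginal likelihood, for comparison; costs O(n^3).
    n = X.shape[0]
    K = rbf_kernel(X, X, lengthscale, variance) + noise * np.eye(n)
    _, logdet = np.linalg.slogdet(K)
    quad = y @ np.linalg.solve(K, y)
    return -0.5 * (n * np.log(2 * np.pi) + logdet + quad)
```

The bound never exceeds the exact log marginal likelihood, and it becomes tight when the inducing inputs coincide with the training inputs; in practice both the inducing locations and the kernel hyperparameters are optimized (often with the natural-gradient methods mentioned above).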