Gaussian Process
Gaussian processes (GPs) are probabilistic models for function approximation that quantify the uncertainty in their predictions. Current research extends their capabilities through architectures such as deep GPs and hybrid models that combine GPs with neural networks or other machine-learning techniques, and tackles scalability and computational-efficiency challenges, particularly in high-dimensional or time-varying settings. These advances are influencing robotics, control systems, and scientific modeling by providing uncertainty-aware predictions that support more reliable decision-making in complex systems, while efficient algorithms and theoretical analyses further strengthen the practical applicability and trustworthiness of GP-based methods.
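To make the "uncertainty-aware predictions" above concrete, the sketch below implements exact GP regression with a squared-exponential (RBF) kernel in plain NumPy, returning a posterior mean and standard deviation at test inputs. The kernel hyperparameters, noise level, and toy sin(x) data are illustrative assumptions, not taken from any of the listed papers.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between the rows of A and B."""
    sq_dists = (np.sum(A**2, axis=1)[:, None]
                + np.sum(B**2, axis=1)[None, :]
                - 2.0 * A @ B.T)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def gp_posterior(X_train, y_train, X_test, noise=0.1, lengthscale=1.0, variance=1.0):
    """Posterior mean and pointwise standard deviation of a GP regressor."""
    K = rbf_kernel(X_train, X_train, lengthscale, variance) + noise**2 * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test, lengthscale, variance)
    K_ss = rbf_kernel(X_test, X_test, lengthscale, variance)

    # Cholesky factorisation gives numerically stable solves against K.
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha

    v = np.linalg.solve(L, K_s)
    cov = K_ss - v.T @ v
    std = np.sqrt(np.clip(np.diag(cov), 0.0, None))
    return mean, std

# Toy example (assumed, for illustration): noisy observations of sin(x).
rng = np.random.default_rng(0)
X_train = rng.uniform(-3, 3, size=(20, 1))
y_train = np.sin(X_train).ravel() + 0.1 * rng.standard_normal(20)
X_test = np.linspace(-4, 4, 100)[:, None]
mean, std = gp_posterior(X_train, y_train, X_test, noise=0.1)
```

The predictive standard deviation grows away from the training inputs, which is the property that GP-based methods in robotics and control exploit for cautious, uncertainty-aware decision-making; the papers below build on this basic machinery with manifold-valued inputs, physics-informed kernels, and distributed sparse approximations.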
Papers
Implicit Gaussian process representation of vector fields over arbitrary latent manifolds
Robert L. Peach, Matteo Vinao-Carl, Nir Grossman, Michael David, Emma Mallas, David Sharp, Paresh A. Malhotra, Pierre Vandergheynst, Adam Gosztolai
Comparing Active Learning Performance Driven by Gaussian Processes or Bayesian Neural Networks for Constrained Trajectory Exploration
Sapphira Akins, Frances Zhu
Posterior Contraction Rates for Matérn Gaussian Processes on Riemannian Manifolds
Paul Rosa, Viacheslav Borovitskiy, Alexander Terenin, Judith Rousseau
A spectrum of physics-informed Gaussian processes for regression in engineering
Elizabeth J Cross, Timothy J Rogers, Daniel J Pitchforth, Samuel J Gibson, Matthew R Jones
Reducing the Environmental Impact of Wireless Communication via Probabilistic Machine Learning
A. Ryo Koblitz, Lorenzo Maggi, Matthew Andrews
Resource-Efficient Cooperative Online Scalar Field Mapping via Distributed Sparse Gaussian Process Regression
Tianyi Ding, Ronghao Zheng, Senlin Zhang, Meiqin Liu