Multi-Output Gaussian Process
Multi-output Gaussian processes (MOGPs) are statistical models that predict multiple correlated outputs jointly, sharing information across outputs to improve accuracy and uncertainty quantification. Current research focuses on challenges such as negative transfer (where coupling weakly correlated outputs degrades predictions) and scalability to high-dimensional data, often employing sparse covariance structures, latent variable models, and variational inference for efficient computation. These advances are broadening the applicability of MOGPs across diverse fields, including reinforcement learning, drug discovery, and environmental modeling, by providing robust, uncertainty-aware predictions for complex systems.
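To make the shared-information idea concrete, below is a minimal NumPy-only sketch of one common MOGP construction, the intrinsic coregionalization model (ICM): a single RBF base kernel over inputs is coupled across outputs by a coregionalization matrix B = W W^T + diag(kappa), and exact GP regression is performed on the stacked outputs. This is an illustrative sketch, not any specific paper's method; all function names and hyperparameter values are assumptions, and exact inference is used here in place of the sparse/variational approximations mentioned above.

```python
# Minimal ICM multi-output GP regression sketch (illustrative, not a library API).
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential base kernel k(x, x')."""
    d2 = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def icm_kernel(X1, X2, B, lengthscale=1.0):
    """ICM covariance: np.kron(B, k(X1, X2)), coupling P outputs via B."""
    return np.kron(B, rbf(X1, X2, lengthscale))

def fit_predict(X, Y, Xstar, B, lengthscale=1.0, noise=1e-2):
    """Exact GP posterior mean/variance for all outputs jointly.
    X: (N, D) inputs shared by all outputs; Y: (N, P) observed outputs."""
    N, P = Y.shape
    y = Y.T.reshape(-1)                          # stack outputs: (P*N,), output-major order
    K = icm_kernel(X, X, B, lengthscale) + noise * np.eye(P * N)
    Ks = icm_kernel(Xstar, X, B, lengthscale)    # (P*M, P*N) cross-covariance
    Kss = icm_kernel(Xstar, Xstar, B, lengthscale)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks @ alpha                            # (P*M,) posterior mean, output-major
    V = np.linalg.solve(L, Ks.T)
    var = np.diag(Kss) - np.sum(V ** 2, axis=0)  # marginal posterior variances
    M = Xstar.shape[0]
    return mean.reshape(P, M).T, var.reshape(P, M).T  # each (M, P)

# Toy example: two correlated outputs driven by a shared latent function.
rng = np.random.default_rng(0)
X = rng.uniform(0, 5, size=(30, 1))
f = np.sin(X[:, 0])
Y = np.column_stack([f, 0.8 * f + 0.1]) + 0.05 * rng.standard_normal((30, 2))
W = np.array([[1.0], [0.8]])                     # low-rank coregionalization weights (assumed)
B = W @ W.T + 0.05 * np.eye(2)                   # output covariance B = W W^T + diag(kappa)
Xstar = np.linspace(0, 5, 50)[:, None]
mu, var = fit_predict(X, Y, Xstar, B)
print(mu.shape, var.shape)                       # (50, 2) posterior mean and variance per output
```

Because output 2 is observed only noisily, its predictions borrow strength from output 1 through B; scalable variants replace the O((PN)^3) Cholesky step with sparse inducing-point and variational approximations.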