Federated Gaussian Process

Federated Gaussian Processes (FGPs) apply Gaussian processes to machine learning tasks while preserving data privacy in distributed settings. Current research focuses on developing scalable algorithms, such as those employing deep kernel learning and random features, to enable efficient global model training across multiple clients without directly sharing data. A key trend involves creating personalized models by combining a globally learned prior with locally trained components, improving both accuracy and adaptability to diverse datasets. This approach holds significant promise for applications requiring privacy-preserving collaborative learning, particularly in scenarios with limited data per client.
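To make the random-features idea concrete, here is a minimal sketch (not any specific paper's method) of federated GP regression: random Fourier features turn the GP into Bayesian linear regression in a shared feature space, so each client can send only aggregated sufficient statistics to the server instead of its raw data. All function names, the synthetic data, and the hyperparameters below are illustrative assumptions.

```python
import numpy as np

def rff(X, W, b):
    # Random Fourier features approximating an RBF kernel (Rahimi & Recht)
    return np.sqrt(2.0 / W.shape[1]) * np.cos(X @ W + b)

def client_update(X, y, W, b):
    # Client side: only aggregated statistics leave the device, never raw data
    Phi = rff(X, W, b)
    return Phi.T @ Phi, Phi.T @ y

def server_aggregate(stats, n_feat, noise_var=0.05):
    # Server side: posterior mean of Bayesian linear regression in feature
    # space, which approximates the GP posterior mean
    A = noise_var * np.eye(n_feat)
    r = np.zeros(n_feat)
    for PtP, Pty in stats:
        A += PtP
        r += Pty
    return np.linalg.solve(A, r)

# Demo: two clients holding disjoint pieces of the same underlying function
rng = np.random.default_rng(0)
n_feat = 200
W = rng.normal(scale=2.0, size=(1, n_feat))   # shared random frequencies
b = rng.uniform(0.0, 2 * np.pi, n_feat)       # shared random phases

f = lambda x: np.sin(3 * x)
X1 = rng.uniform(-2, 0, (40, 1)); y1 = f(X1[:, 0]) + 0.05 * rng.normal(size=40)
X2 = rng.uniform(0, 2, (40, 1));  y2 = f(X2[:, 0]) + 0.05 * rng.normal(size=40)

stats = [client_update(X1, y1, W, b), client_update(X2, y2, W, b)]
w = server_aggregate(stats, n_feat)

X_test = np.linspace(-2, 2, 50).reshape(-1, 1)
pred = rff(X_test, W, b) @ w
rmse = np.sqrt(np.mean((pred - f(X_test[:, 0])) ** 2))
```

Because the clients' statistics add up exactly, the server recovers the same posterior it would have obtained on the pooled data; a personalized variant would additionally fine-tune local kernel or likelihood parameters on each client, as described above.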

Papers