Paper ID: 2304.01762

Incorporating Unlabelled Data into Bayesian Neural Networks

Mrinank Sharma, Tom Rainforth, Yee Whye Teh, Vincent Fortuin

Conventional Bayesian Neural Networks (BNNs) are unable to leverage unlabelled data to improve their predictions. To overcome this limitation, we introduce Self-Supervised Bayesian Neural Networks, which use unlabelled data to learn models with suitable prior predictive distributions. This is achieved by combining contrastive pretraining techniques with the optimisation of a variational lower bound. We then show that the prior predictive distributions of self-supervised BNNs capture problem semantics better than conventional BNN priors. In turn, our approach offers improved predictive performance over conventional BNNs, especially in low-budget regimes.
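To make the two-stage idea in the abstract concrete, here is a minimal sketch (not the authors' code) of one plausible instantiation: a deterministic encoder is first trained on unlabelled data with a SimCLR-style contrastive loss, and a small Bayesian head over the learned representation is then fit on labelled data by optimising a variational lower bound (ELBO). All names, architectures, and dimensions below (Encoder, nt_xent_loss, VariationalHead, feat_dim, etc.) are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Deterministic feature extractor trained with a contrastive objective."""
    def __init__(self, in_dim=784, feat_dim=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 512), nn.ReLU(),
                                 nn.Linear(512, feat_dim))

    def forward(self, x):
        return self.net(x)

def nt_xent_loss(z1, z2, temperature=0.5):
    """SimCLR-style contrastive loss between two augmented views (stage 1)."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)                 # (2N, d)
    sim = z @ z.t() / temperature                  # scaled cosine similarities
    n = z1.shape[0]
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float('-inf'))     # drop self-similarities
    # The positive for example i is its other augmented view.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

class VariationalHead(nn.Module):
    """Mean-field Gaussian posterior over a linear classifier's weights (stage 2)."""
    def __init__(self, feat_dim=128, n_classes=10):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(n_classes, feat_dim))
        self.log_sigma = nn.Parameter(torch.full((n_classes, feat_dim), -3.0))

    def forward(self, feats):
        eps = torch.randn_like(self.mu)
        w = self.mu + self.log_sigma.exp() * eps   # reparameterisation trick
        return feats @ w.t()

    def kl(self):
        # KL(q(w) || N(0, I)) for a factorised Gaussian posterior.
        var = (2 * self.log_sigma).exp()
        return 0.5 * (var + self.mu ** 2 - 1.0 - 2 * self.log_sigma).sum()

def negative_elbo(head, feats, y, dataset_size):
    """Per-batch negative ELBO: expected NLL plus KL scaled to the full dataset."""
    logits = head(feats)
    nll = F.cross_entropy(logits, y, reduction='mean')
    return nll + head.kl() / dataset_size
```

In this sketch, the unlabelled data only shapes the encoder (and hence the prior predictive induced over the head's outputs), while the labelled data is used for approximate Bayesian inference over the head; the paper should be consulted for the exact objective and which parameters are treated probabilistically.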

Submitted: Apr 4, 2023