Paper ID: 2205.06915
Formal limitations of sample-wise information-theoretic generalization bounds
Hrayr Harutyunyan, Greg Ver Steeg, Aram Galstyan
Some of the tightest information-theoretic generalization bounds depend on the average information between the learned hypothesis and a single training example. However, these sample-wise bounds were derived only for the expected generalization gap. We show that even for the expected squared generalization gap, no such sample-wise information-theoretic bounds exist. The same is true for PAC-Bayes and single-draw bounds. Remarkably, PAC-Bayes, single-draw, and expected squared generalization gap bounds that depend on information in pairs of examples do exist.
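For context, a minimal sketch of the kind of sample-wise bound the abstract refers to, in the style of Bu et al. (2020), assuming the loss $\ell(w, Z)$ is $\sigma$-sub-Gaussian for every hypothesis $w$; the paper's precise setting and notation may differ:

\[
\mathbb{E}\bigl[R(W) - R_S(W)\bigr] \;\le\; \frac{1}{n} \sum_{i=1}^{n} \sqrt{2\sigma^2\, I(W; Z_i)},
\]

where $W$ is the learned hypothesis, $S = (Z_1, \dots, Z_n)$ is the training set, $R$ and $R_S$ are the population and empirical risks, and $I(W; Z_i)$ is the mutual information between the hypothesis and the $i$-th training example. The paper's negative result concerns whether bounds of this per-example form can be extended beyond the expectation on the left-hand side.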
Submitted: May 13, 2022