Paper Title

Formal limitations of sample-wise information-theoretic generalization bounds

Paper Authors

Hrayr Harutyunyan, Greg Ver Steeg, Aram Galstyan

Paper Abstract

Some of the tightest information-theoretic generalization bounds depend on the average information between the learned hypothesis and a single training example. However, these sample-wise bounds were derived only for the expected generalization gap. We show that no such sample-wise information-theoretic bounds exist even for the expected squared generalization gap. The same is true for PAC-Bayes and single-draw bounds. Remarkably, PAC-Bayes, single-draw, and expected squared generalization gap bounds that depend on information in pairs of examples do exist.
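For context, a representative sample-wise bound of the kind the abstract refers to (following prior work by Bu, Zou, and Veeravalli) can be sketched as follows; the symbols $W$, $S=(Z_1,\ldots,Z_n)$, and $\sigma$ are introduced here only for illustration and are not part of the original abstract, assuming a $\sigma$-subgaussian loss:

$$
\left|\,\mathbb{E}\!\left[L(W) - L_S(W)\right]\right| \;\le\; \frac{1}{n}\sum_{i=1}^{n}\sqrt{2\sigma^{2}\, I(W; Z_i)},
$$

where $L(W)$ is the population risk, $L_S(W)$ is the empirical risk on the training set $S=(Z_1,\ldots,Z_n)$, and $I(W; Z_i)$ is the mutual information between the learned hypothesis $W$ and the $i$-th training example.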
