Paper Title

Approximate is Good Enough: Probabilistic Variants of Dimensional and Margin Complexity

Authors

Pritish Kamath, Omar Montasser, Nathan Srebro

Abstract

We present and study approximate notions of dimensional and margin complexity, which correspond to the minimal dimension or norm of an embedding required to approximate, rather than exactly represent, a given hypothesis class. We show that such notions are not only sufficient for learning using linear predictors or a kernel, but unlike the exact variants, are also necessary. Thus they are better suited for discussing limitations of linear or kernel methods.
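As a rough numerical illustration of why approximate representation can be much cheaper than exact representation, the sketch below plants a margin-γ linearly separable instance in a high dimension d and projects it down to k ≪ d dimensions with a random Gaussian map (a Johnson–Lindenstrauss-style embedding). The labels of almost all points are preserved, even though the low-dimensional embedding only approximates the original class. This is a hedged toy demo, not a construction from the paper; all dimensions, the margin value, and the planted-data scheme are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, n, gamma = 500, 200, 2000, 0.3  # illustrative choices, not from the paper

# Planted margin-gamma instance: unit weight vector w and unit points x_i
# constructed so that <w, x_i> = ±gamma exactly.
w = rng.normal(size=d)
w /= np.linalg.norm(w)
labels = rng.choice([-1.0, 1.0], size=n)
U = rng.normal(size=(n, d))
U -= np.outer(U @ w, w)                          # make each row orthogonal to w
U /= np.linalg.norm(U, axis=1, keepdims=True)    # renormalize to unit length
X = gamma * labels[:, None] * w + np.sqrt(1 - gamma**2) * U

# Random Gaussian embedding into k << d dimensions; inner products are
# preserved up to ~1/sqrt(k) error, so margin-gamma signs rarely flip.
G = rng.normal(size=(k, d)) / np.sqrt(k)
agreement = np.mean(np.sign((X @ G.T) @ (G @ w)) == labels)
print(f"sign agreement after projecting {d} -> {k} dims: {agreement:.3f}")
```

With these parameters the per-point sign-flip probability is tiny (the margin γ = 0.3 is several standard deviations of the ~1/√k projection noise), so the embedded linear predictor agrees with the original on essentially all points, illustrating the "approximate is good enough" theme.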
