Title
SQ Lower Bounds for Learning Single Neurons with Massart Noise
Authors
Abstract
We study the problem of PAC learning a single neuron in the presence of Massart noise. Specifically, for a known activation function $f: \mathbb{R} \to \mathbb{R}$, the learner is given access to labeled examples $(\mathbf{x}, y) \in \mathbb{R}^d \times \mathbb{R}$, where the marginal distribution of $\mathbf{x}$ is arbitrary and the corresponding label $y$ is a Massart corruption of $f(\langle \mathbf{w}, \mathbf{x} \rangle)$. The goal of the learner is to output a hypothesis $h: \mathbb{R}^d \to \mathbb{R}$ with small squared loss. For a range of activation functions, including ReLUs, we establish super-polynomial Statistical Query (SQ) lower bounds for this learning problem. In more detail, we prove that no efficient SQ algorithm can approximate the optimal error within any constant factor. Our main technical contribution is a novel SQ-hard construction for learning $\{\pm 1\}$-weight Massart halfspaces on the Boolean hypercube that is interesting in its own right.
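To make the data model concrete, here is a minimal illustrative sketch (not taken from the paper) of drawing labeled examples for a single ReLU neuron under a Massart-style corruption: each clean label $f(\langle \mathbf{w}, \mathbf{x} \rangle)$ is replaced by an adversarially chosen value with some point-dependent probability $\eta(\mathbf{x})$ bounded by a constant $\eta_{\max}$. The particular marginal (uniform on the hypercube), the choice of $\eta(\mathbf{x})$, and the corrupted label are all hypothetical placeholders chosen for illustration.

```python
import random

def relu(z):
    return max(0.0, z)

def massart_sample(w, eta_max, rng):
    """Draw one example (x, y): the clean label is relu(<w, x>);
    with point-dependent probability eta(x) <= eta_max, an adversary
    replaces the label. Both eta(x) and the corrupted value below are
    illustrative choices, not the paper's construction."""
    d = len(w)
    x = [rng.choice([-1.0, 1.0]) for _ in range(d)]  # hypercube marginal (illustrative)
    clean = relu(sum(wi * xi for wi, xi in zip(w, x)))
    eta_x = eta_max * rng.random()                   # some eta(x) in [0, eta_max]
    if rng.random() < eta_x:
        y = -clean - 1.0                             # adversarial label (illustrative)
    else:
        y = clean
    return x, y

rng = random.Random(0)
w = [1.0, -1.0, 1.0, 1.0]                            # {+-1}-weight vector, as in the halfspace setting
data = [massart_sample(w, eta_max=0.4, rng=rng) for _ in range(1000)]
corrupted = sum(
    1 for x, y in data
    if y != relu(sum(wi * xi for wi, xi in zip(w, x)))
)
rate = corrupted / len(data)
print(rate)  # empirical corruption rate, bounded by eta_max
```

The learner sees only the pairs `(x, y)` and must output a hypothesis with small squared loss against the clean target; the SQ lower bound says no efficient SQ algorithm can get within any constant factor of the optimal error in this regime.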