Paper Title

Subspace Fitting Meets Regression: The Effects of Supervision and Orthonormality Constraints on Double Descent of Generalization Errors

Authors

Yehuda Dar, Paul Mayer, Lorenzo Luzi, Richard G. Baraniuk

Abstract

We study the linear subspace fitting problem in the overparameterized setting, where the estimated subspace can perfectly interpolate the training examples. Our scope includes the least-squares solutions to subspace fitting tasks with varying levels of supervision in the training data (i.e., the proportion of input-output examples of the desired low-dimensional mapping) and orthonormality of the vectors defining the learned operator. This flexible family of problems connects standard, unsupervised subspace fitting that enforces strict orthonormality with a corresponding regression task that is fully supervised and does not constrain the linear operator structure. This class of problems is defined over a supervision-orthonormality plane, where each coordinate induces a problem instance with a unique pair of supervision level and softness of orthonormality constraints. We explore this plane and show that the generalization errors of the corresponding subspace fitting problems follow double descent trends as the settings become more supervised and less orthonormally constrained.
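The double descent trend described in the abstract can be illustrated numerically. The sketch below is a minimal toy experiment, not the paper's actual setup: it fits a plain linear regression (the fully supervised, unconstrained corner of the supervision-orthonormality plane) with the minimum-norm least-squares solution on synthetic Gaussian data, and sweeps the number of fitted parameters through the interpolation threshold. The data model, dimensions, noise level, and the helper name `min_norm_lsq_test_error` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def min_norm_lsq_test_error(n_train=40, d=80, n_test=500, n_feat=None, trials=50):
    """Average test MSE of the minimum-norm least-squares fit that uses
    only the first n_feat of d features (toy model, not the paper's setup)."""
    errs = []
    for _ in range(trials):
        w = rng.normal(size=d) / np.sqrt(d)           # ground-truth linear map
        X = rng.normal(size=(n_train, d))
        y = X @ w + 0.1 * rng.normal(size=n_train)    # noisy training labels
        X_test = rng.normal(size=(n_test, d))
        y_test = X_test @ w
        # pinv returns the minimum-norm solution when the system is
        # underdetermined, i.e. when n_feat > n_train (overparameterized,
        # the fit interpolates the training examples exactly)
        w_hat = np.linalg.pinv(X[:, :n_feat]) @ y
        errs.append(np.mean((X_test[:, :n_feat] @ w_hat - y_test) ** 2))
    return float(np.mean(errs))

# Sweep the parameter count through the interpolation threshold
# (n_feat == n_train): the test error peaks near the threshold and
# descends again in the overparameterized regime.
errors = {p: min_norm_lsq_test_error(n_feat=p) for p in (10, 30, 40, 60, 80)}
```

In this sketch the generalization error spikes where the number of fitted parameters equals the number of training examples and then decreases again, which is the double descent shape the paper traces across varying supervision levels and orthonormality constraints.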
