Paper Title


The Slow Deterioration of the Generalization Error of the Random Feature Model

Paper Authors

Chao Ma, Lei Wu, Weinan E

Paper Abstract


The random feature model exhibits a kind of resonance behavior when the number of parameters is close to the training sample size. This behavior is characterized by the appearance of a large generalization gap, and is due to the occurrence of very small eigenvalues of the associated Gram matrix. In this paper, we examine the dynamic behavior of the gradient descent algorithm in this regime. We show, both theoretically and experimentally, that there is a dynamic self-correction mechanism at work: the larger the eventual generalization gap, the slower it develops, both being consequences of the small eigenvalues. This gives us ample time to stop the training process and obtain solutions with good generalization properties.
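The "resonance" regime the abstract describes can be seen numerically: when the number of random features m approaches the sample size n, the smallest eigenvalue of the Gram matrix collapses toward zero. The sketch below is not from the paper; it is a minimal illustration with assumed choices (ReLU features, Gaussian data, n = 100) to make the eigenvalue behavior visible.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100  # training sample size (assumed for illustration)
d = 5    # input dimension (assumed for illustration)
X = rng.standard_normal((n, d))

def min_gram_eigenvalue(m):
    """Smallest eigenvalue of the normalized Gram matrix of a
    random feature model with m ReLU features."""
    W = rng.standard_normal((d, m))          # random (fixed) feature weights
    F = np.maximum(X @ W, 0.0) / np.sqrt(m)  # n x m feature matrix
    G = F @ F.T                              # n x n Gram matrix
    return np.linalg.eigvalsh(G)[0]          # eigvalsh returns ascending order

# Under-parameterized (m < n): G is rank-deficient, min eigenvalue ~ 0.
# Near m = n: min eigenvalue is tiny but positive -- the resonance regime.
# Over-parameterized (m >> n): min eigenvalue is bounded away from zero.
for m in (20, 100, 500):
    print(f"m = {m:4d}  min eigenvalue = {min_gram_eigenvalue(m):.2e}")
```

Because gradient descent converges along each eigendirection at a rate set by the corresponding eigenvalue, the components tied to these tiny eigenvalues (which drive the generalization gap) are exactly the ones that evolve most slowly, which is the self-correction mechanism the abstract refers to.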
