Paper Title

Utilizing Excess Resources in Training Neural Networks

Paper Authors

Amit Henig, Raja Giryes

Paper Abstract

In this work, we suggest Kernel Filtering Linear Overparameterization (KFLO), where a linear cascade of filtering layers is used during training to improve network performance at test time. We implement this cascade in a kernel filtering fashion, which prevents the trained architecture from becoming unnecessarily deep. This also allows using our approach with almost any network architecture and lets us combine the filtering layers into a single layer at test time. Thus, our approach does not add computational complexity during inference. We demonstrate the advantage of KFLO on various network models and datasets in supervised learning.
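
A minimal sketch of the collapse step mentioned in the abstract, assuming a toy cascade of a k x k convolution followed by a 1 x 1 convolution with no nonlinearity in between; the shapes, kernel sizes, and variable names below are illustrative assumptions, not the authors' KFLO implementation. Because the cascade is purely linear, it folds into a single convolution at test time, so inference cost does not grow:

```python
# Sketch: a linear cascade of filtering layers used during training can be
# merged into one kernel for inference. All names and sizes here are toy
# assumptions, not the paper's actual code.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
C_in, C_mid, C_out, k = 3, 8, 4, 3

# Training-time cascade: k x k conv -> 1 x 1 conv (bias-free, no activation in between).
w1 = torch.randn(C_mid, C_in, k, k)    # first filtering layer
w2 = torch.randn(C_out, C_mid, 1, 1)   # second (pointwise) filtering layer

x = torch.randn(1, C_in, 16, 16)
y_cascade = F.conv2d(F.conv2d(x, w1, padding=k // 2), w2)

# Test-time collapse: fold the 1 x 1 kernel into the k x k kernel,
# yielding a single conv with the same output.
w_merged = torch.einsum('om,mikl->oikl', w2[:, :, 0, 0], w1)
y_merged = F.conv2d(x, w_merged, padding=k // 2)

print(torch.allclose(y_cascade, y_merged, atol=1e-4))  # True: one conv at inference
```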
