Paper Title

Lipschitz Bounded Equilibrium Networks

Authors

Max Revay, Ruigang Wang, Ian R. Manchester

Abstract

This paper introduces new parameterizations of equilibrium neural networks, i.e. networks defined by implicit equations. This model class includes standard multilayer and residual networks as special cases. The new parameterization admits a Lipschitz bound during training via unconstrained optimization: no projections or barrier functions are required. Lipschitz bounds are a common proxy for robustness and appear in many generalization bounds. Furthermore, compared to previous works we show well-posedness (existence of solutions) under less restrictive conditions on the network weights and more natural assumptions on the activation functions: that they are monotone and slope restricted. These results are proved by establishing novel connections with convex optimization, operator splitting on non-Euclidean spaces, and contracting neural ODEs. In image classification experiments we show that the Lipschitz bounds are very accurate and improve robustness to adversarial attacks.
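As a rough illustration of the model class described above (and not the paper's actual parameterization, which is not reproduced here), the sketch below shows a generic equilibrium layer z* = σ(W z* + U x + b) solved by a damped fixed-point iteration with a monotone, slope-restricted activation (ReLU). The names W, U, b, the scaling of W, and the iteration scheme are illustrative assumptions only; a Lipschitz bound γ on the map x ↦ z* means ‖z*(x₁) − z*(x₂)‖ ≤ γ‖x₁ − x₂‖.

```python
import numpy as np

def relu(z):
    # Monotone and slope-restricted in [0, 1], as assumed for the activations.
    return np.maximum(z, 0.0)

def equilibrium_forward(x, W, U, b, n_iter=200, alpha=0.5):
    """Solve z = relu(W z + U x + b) by a damped (averaged) fixed-point iteration.

    This is a generic illustration of an equilibrium (implicit) layer, not the
    paper's parameterization or solver.
    """
    z = np.zeros(W.shape[0])
    for _ in range(n_iter):
        z = (1 - alpha) * z + alpha * relu(W @ z + U @ x + b)
    return z

# Hypothetical shapes and weights for illustration.
rng = np.random.default_rng(0)
n, d = 8, 4
W = 0.1 * rng.standard_normal((n, n))   # scaled small so the iteration contracts
U = rng.standard_normal((n, d))
b = rng.standard_normal(n)
x = rng.standard_normal(d)
z_star = equilibrium_forward(x, W, U, b)
```

A standard feedforward multilayer network is recovered as a special case when W has a block strictly triangular structure, so the implicit equation can be solved layer by layer in a finite number of substitutions.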
