Paper Title


Enabling certification of verification-agnostic networks via memory-efficient semidefinite programming

Paper Authors

Sumanth Dathathri, Krishnamurthy Dvijotham, Alexey Kurakin, Aditi Raghunathan, Jonathan Uesato, Rudy Bunel, Shreya Shankar, Jacob Steinhardt, Ian Goodfellow, Percy Liang, Pushmeet Kohli

Paper Abstract


Convex relaxations have emerged as a promising approach for verifying desirable properties of neural networks, such as robustness to adversarial perturbations. Widely used Linear Programming (LP) relaxations only work well when networks are trained to facilitate verification. This precludes applications that involve verification-agnostic networks, i.e., networks not specially trained for verification. On the other hand, semidefinite programming (SDP) relaxations have successfully been applied to verification-agnostic networks, but do not currently scale beyond small networks due to poor time and space asymptotics. In this work, we propose a first-order dual SDP algorithm that (1) requires memory only linear in the total number of network activations, and (2) requires only a fixed number of forward/backward passes through the network per iteration. By exploiting iterative eigenvector methods, we express all solver operations in terms of forward and backward passes through the network, enabling efficient use of hardware like GPUs/TPUs. For two verification-agnostic networks on MNIST and CIFAR-10, we significantly improve L-inf verified robust accuracy from 1% to 88% and from 6% to 40%, respectively. We also demonstrate tight verification of a quadratic stability specification for the decoder of a variational autoencoder.
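The core memory-saving idea in the abstract is that the eigenvalue computations of the dual SDP are accessed only through matrix-vector products, each of which costs one forward/backward pass through the network, so the matrix itself is never materialized. A minimal illustration of this pattern is plain power iteration driven by a matvec callback (the toy matrix and all names here are illustrative; the paper itself uses more refined Lanczos-style iterative eigenvector methods):

```python
import numpy as np

def power_iteration(matvec, dim, iters=200, seed=0):
    """Estimate the dominant eigenpair of a symmetric operator
    available only through matrix-vector products.

    Memory use is O(dim): the operator is never formed as a dense
    matrix, mirroring how the dual SDP solver touches its matrix
    only via network forward/backward passes."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(dim)
    v /= np.linalg.norm(v)
    for _ in range(iters):
        w = matvec(v)              # one matvec (≈ one fwd/bwd pass) per iteration
        v = w / np.linalg.norm(w)  # renormalize to keep iterates bounded
    return v @ matvec(v), v        # Rayleigh quotient and eigenvector estimate

# Toy stand-in: a small symmetric matrix exposed only via a matvec closure.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
lam, v = power_iteration(lambda x: A @ x, dim=2)
```

In the paper's setting the `matvec` closure would be a Hessian/Jacobian-vector product evaluated by automatic differentiation, which is exactly what makes the solver map well onto GPUs/TPUs.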
