Paper Title

Differentially private training of residual networks with scale normalisation

Paper Authors

Helena Klause, Alexander Ziller, Daniel Rueckert, Kerstin Hammernik, Georgios Kaissis

Paper Abstract

The training of neural networks with Differentially Private Stochastic Gradient Descent (DP-SGD) offers formal Differential Privacy guarantees but introduces accuracy trade-offs. In this work, we propose to alleviate these trade-offs in residual networks with Group Normalisation through a simple architectural modification termed ScaleNorm, by which an additional normalisation layer is introduced after the residual block's addition operation. Our method allows us to further improve on the recently reported state-of-the-art on CIFAR-10, achieving a top-1 accuracy of 82.5% (ε=8.0) when trained from scratch.
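
A minimal PyTorch sketch of the modification the abstract describes: a residual block that uses Group Normalisation inside the block and adds one extra normalisation layer after the skip addition (the "ScaleNorm" change). The class name, layer widths, group count, activation placement, and the choice of GroupNorm for the post-addition layer are illustrative assumptions based only on the abstract, not the authors' exact architecture.

```python
import torch
import torch.nn as nn


class ScaleNormResidualBlock(nn.Module):
    """Residual block with Group Normalisation plus an extra
    normalisation layer after the skip addition ("ScaleNorm").

    Sketch based on the abstract only; hyperparameters are
    placeholder choices, not the paper's configuration.
    """

    def __init__(self, channels: int, num_groups: int = 8):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.gn1 = nn.GroupNorm(num_groups, channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.gn2 = nn.GroupNorm(num_groups, channels)
        self.act = nn.ReLU()
        # The proposed modification: one more normalisation layer,
        # applied *after* the residual addition.
        self.scale_norm = nn.GroupNorm(num_groups, channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.act(self.gn1(self.conv1(x)))
        out = self.gn2(self.conv2(out))
        out = out + x                 # skip connection
        out = self.scale_norm(out)    # normalise the summed output
        return self.act(out)
```

Group Normalisation is the natural choice here because Batch Normalisation mixes statistics across samples, which breaks the per-sample gradient computation that DP-SGD requires.

To connect such a model to the (ε=8.0) guarantee quoted above, the sketch below wraps it with the Opacus library's PrivacyEngine, one common way to run DP-SGD in PyTorch. The abstract does not state the authors' tooling; the toy data, δ, epoch count, and clipping norm are placeholder choices, and ScaleNormResidualBlock is the class from the previous sketch.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine

# Toy stand-in for CIFAR-10-shaped data (hypothetical, for illustration).
images = torch.randn(256, 3, 32, 32)
labels = torch.randint(0, 10, (256,))
train_loader = DataLoader(TensorDataset(images, labels), batch_size=64)

model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 16, 3, padding=1),
    ScaleNormResidualBlock(16),
    torch.nn.AdaptiveAvgPool2d(1),
    torch.nn.Flatten(),
    torch.nn.Linear(16, 10),
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Calibrate the DP-SGD noise so the whole run satisfies (ε=8.0, δ=1e-5);
# δ, epochs, and max_grad_norm are illustrative values.
privacy_engine = PrivacyEngine()
model, optimizer, train_loader = privacy_engine.make_private_with_epsilon(
    module=model,
    optimizer=optimizer,
    data_loader=train_loader,
    epochs=10,
    target_epsilon=8.0,
    target_delta=1e-5,
    max_grad_norm=1.0,
)
```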
