Paper Title

Gator: Customizable Channel Pruning of Neural Networks with Gating

Paper Authors

Eli Passov, Eli David, Nathan S. Netanyahu

Paper Abstract

The rise of neural network (NN) applications has prompted an increased interest in compression, with a particular focus on channel pruning, which does not require any additional hardware. Most pruning methods employ either single-layer operations or global schemes to determine which channels to remove, followed by fine-tuning of the network. In this paper we present Gator, a channel-pruning method that temporarily adds learned gating mechanisms for pruning individual channels, and that is trained with an additional auxiliary loss aimed at reducing the computational cost in terms of memory, (theoretical) speedup (measured in FLOPs), and practical, hardware-specific speedup. Gator introduces a new formulation of dependencies between NN layers which, in contrast to most previous methods, enables pruning of non-sequential parts, such as layers on ResNet's highway, and even removing entire ResNet blocks. Gator's pruning of ResNet-50 trained on ImageNet produces state-of-the-art (SOTA) results, such as a 50% FLOPs reduction with only a 0.4% drop in top-5 accuracy. Also, Gator outperforms previous pruning models in terms of GPU latency, running 1.4 times faster. Furthermore, Gator achieves improved top-5 accuracy results, compared to MobileNetV2 and SqueezeNet, for similar runtimes. The source code of this work is available at: https://github.com/EliPassov/gator.
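The abstract describes Gator's core idea at a high level: a learned gate is attached to each channel, and the network is trained with an auxiliary loss that penalizes the computational cost of the channels left open. The sketch below is a rough, hypothetical PyTorch illustration of that idea, not the authors' implementation (see the linked repository for that); the names ChannelGate and auxiliary_cost_loss, as well as the FLOP estimate, are assumptions made for the example.

```python
# Rough illustrative sketch (NOT the authors' implementation) of a learned
# per-channel gate plus an auxiliary FLOP-cost loss, as described in the
# abstract. All names (ChannelGate, auxiliary_cost_loss) are hypothetical.
import torch
import torch.nn as nn


class ChannelGate(nn.Module):
    """Scales each output channel by a learned gate in (0, 1)."""

    def __init__(self, num_channels: int):
        super().__init__()
        # Logits initialized so gates start close to 1 (all channels kept).
        self.logits = nn.Parameter(torch.full((num_channels,), 3.0))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        gates = torch.sigmoid(self.logits)       # shape (C,)
        return x * gates.view(1, -1, 1, 1)       # broadcast over N, H, W

    def expected_open(self) -> torch.Tensor:
        # Soft count of channels that remain open (used by the auxiliary loss).
        return torch.sigmoid(self.logits).sum()


def auxiliary_cost_loss(gates, per_channel_flops, weight=1e-9):
    """Auxiliary loss: expected FLOP cost of the gated channels, scaled by `weight`."""
    cost = sum(g.expected_open() * c for g, c in zip(gates, per_channel_flops))
    return weight * cost


# Usage sketch: conv -> gate; total loss = task loss + auxiliary cost loss.
conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
gate = ChannelGate(16)
x = torch.randn(2, 3, 32, 32)
y = gate(conv(x))
# Per-channel FLOP estimate for a 3x3 conv over a 32x32 map with 3 input channels.
per_channel_cost = 32 * 32 * 3 * 3 * 3
loss = y.pow(2).mean() + auxiliary_cost_loss([gate], [per_channel_cost])
loss.backward()
```

After training, channels whose gates collapse toward zero can be removed and the gates folded away. The actual Gator formulation also models cross-layer dependencies (e.g., layers on ResNet's highway), which this toy example does not attempt.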
