Paper Title

Tensor Reordering for CNN Compression

Authors

Matej Ulicny, Vladimir A. Krylov, Rozenn Dahyot

Abstract

We show how parameter redundancy in Convolutional Neural Network (CNN) filters can be effectively reduced by pruning in spectral domain. Specifically, the representation extracted via Discrete Cosine Transform (DCT) is more conducive for pruning than the original space. By relying on a combination of weight tensor reshaping and reordering we achieve high levels of layer compression with just minor accuracy loss. Our approach is applied to compress pretrained CNNs and we show that minor additional fine-tuning allows our method to recover the original model performance after a significant parameter reduction. We validate our approach on ResNet-50 and MobileNet-V2 architectures for ImageNet classification task.
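To make the abstract's pipeline concrete, here is a minimal NumPy/SciPy sketch of the general idea described above: reshape a convolutional weight tensor, reorder entries to concentrate energy, apply a 1-D DCT, prune small spectral coefficients, and invert. The reordering rule (per-filter magnitude sort) and the 25% keep ratio are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 32, 3, 3))  # conv weights: (out, in, kH, kW)

# Reshape: flatten each output filter into one long vector.
flat = W.reshape(64, -1)                 # (64, 288)

# Reorder (illustrative choice): sort each row by magnitude so that
# energy is concentrated, which makes the DCT representation sparser.
order = np.argsort(-np.abs(flat), axis=1)
reordered = np.take_along_axis(flat, order, axis=1)

# Transform to the spectral domain with an orthonormal DCT.
coeffs = dct(reordered, norm='ortho', axis=1)

# Prune: keep only the largest 25% of coefficients in each filter.
k = flat.shape[1] // 4
thresh = np.partition(np.abs(coeffs), -k, axis=1)[:, -k]
pruned = np.where(np.abs(coeffs) >= thresh[:, None], coeffs, 0.0)

# Reconstruct: inverse DCT, undo the reordering, restore the shape.
rec = idct(pruned, norm='ortho', axis=1)
inv = np.empty_like(rec)
np.put_along_axis(inv, order, rec, axis=1)  # inv[i, order[i, j]] = rec[i, j]
W_hat = inv.reshape(W.shape)                # approximated weights, (64, 32, 3, 3)
```

Only the retained DCT coefficients (plus the reordering indices) would need to be stored, which is where the compression comes from; in the paper this is followed by light fine-tuning to recover accuracy.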
