Paper Title

Approximating Continuous Convolutions for Deep Network Compression

Paper Authors

Costain, Theo W., Prisacariu, Victor Adrian

Paper Abstract

We present ApproxConv, a novel method for compressing the layers of a convolutional neural network. Reframing conventional discrete convolution as continuous convolution of parametrised functions over space, we use functional approximations to capture the essential structures of CNN filters with fewer parameters than conventional operations. Our method is able to reduce the size of trained CNN layers, requiring only a small amount of fine-tuning. We show that our method is able to compress existing deep network models by half whilst losing only 1.86% accuracy. Further, we demonstrate that our method is compatible with other compression methods, such as quantisation, allowing for further reductions in model size.
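The abstract does not specify the functional family ApproxConv uses, but the core idea of replacing a discrete filter's weights with a low-parameter function evaluated at the tap positions can be sketched as follows. This is a hypothetical illustration using a least-squares fit of a bivariate polynomial, not the authors' actual method: a 7x7 filter (49 weights) is summarised by 10 polynomial coefficients.

```python
import numpy as np

# Hypothetical illustration: approximate a discrete 7x7 filter with a
# low-degree bivariate polynomial evaluated at the tap positions.
rng = np.random.default_rng(0)
kernel = rng.standard_normal((7, 7))  # stand-in for a trained CNN filter

# Coordinates of the filter taps, normalised to [-1, 1].
ys, xs = np.meshgrid(np.linspace(-1, 1, 7), np.linspace(-1, 1, 7),
                     indexing="ij")

# Design matrix of monomials x^i * y^j with i + j <= 3 (10 terms).
degree = 3
terms = [(i, j) for i in range(degree + 1) for j in range(degree + 1 - i)]
A = np.stack([xs.ravel() ** i * ys.ravel() ** j for i, j in terms], axis=1)

# Least-squares fit: 10 coefficients stand in for the 49 kernel weights.
coeffs, *_ = np.linalg.lstsq(A, kernel.ravel(), rcond=None)
approx = (A @ coeffs).reshape(7, 7)

print(f"parameters: {kernel.size} -> {coeffs.size}")
rel_err = np.linalg.norm(kernel - approx) / np.linalg.norm(kernel)
print(f"relative reconstruction error: {rel_err:.3f}")
```

In a compressed layer one would store only the coefficients and re-evaluate the function at the tap grid when convolving; the fine-tuning mentioned in the abstract would then adjust the coefficients directly to recover accuracy.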
