Paper Title

MTP:面向高效语义分割网络的多任务剪枝

MTP: Multi-Task Pruning for Efficient Semantic Segmentation Networks

Paper Authors

Xinghao Chen, Yiman Zhang, Yunhe Wang

Paper Abstract

本文关注语义分割网络的通道剪枝。以往在分类任务中压缩和加速深度神经网络的方法无法直接应用于语义分割网络,因为后者通过预训练隐含了一个多任务学习问题。为了识别分割网络中的冗余,我们提出了一种多任务通道剪枝方法:任意一层中每个卷积滤波器(即通道)的重要性由分类和分割两个任务共同确定。此外,我们设计了一种交替优化方案,用于求解整个网络中滤波器的重要性得分。在多个基准上的实验结果表明,所提算法优于当前最先进的剪枝方法。值得注意的是,在 DeepLabv3 上我们可以获得约 2 倍的 FLOPs 压缩,而在 PASCAL VOC 2012 数据集上仅损失约 1% mIoU,在 Cityscapes 数据集上仅损失约 1.3% mIoU。

This paper focuses on channel pruning for semantic segmentation networks. Previous methods to compress and accelerate deep neural networks in the classification task cannot be straightforwardly applied to the semantic segmentation network that involves an implicit multi-task learning problem via pre-training. To identify the redundancy in segmentation networks, we present a multi-task channel pruning approach. The importance of each convolution filter \wrt the channel of an arbitrary layer will be simultaneously determined by the classification and segmentation tasks. In addition, we develop an alternative scheme for optimizing importance scores of filters in the entire network. Experimental results on several benchmarks illustrate the superiority of the proposed algorithm over the state-of-the-art pruning methods. Notably, we can obtain an about $2\times$ FLOPs reduction on DeepLabv3 with only an about $1\%$ mIoU drop on the PASCAL VOC 2012 dataset and an about $1.3\%$ mIoU drop on Cityscapes dataset, respectively.
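The core idea in the abstract — scoring each convolution filter jointly by the classification and segmentation tasks, then pruning the least important channels — can be sketched as below. This is a minimal illustration, not the authors' exact formulation: the per-task scores, the max-normalization, and the weighting parameter `lam` are assumptions made for the example.

```python
import numpy as np

def multitask_channel_importance(score_cls, score_seg, lam=0.5):
    """Combine per-filter importance scores from the classification and
    segmentation tasks into one score per channel.

    score_cls, score_seg: 1-D arrays, one importance value per filter.
    lam: weight balancing the two tasks (illustrative choice).
    """
    # Normalize each task's scores so neither dominates by scale alone.
    score_cls = score_cls / (np.abs(score_cls).max() + 1e-12)
    score_seg = score_seg / (np.abs(score_seg).max() + 1e-12)
    return lam * score_cls + (1.0 - lam) * score_seg

def select_channels(importance, keep_ratio=0.5):
    """Return sorted indices of the channels to keep (highest scores first)."""
    n_keep = max(1, int(round(keep_ratio * importance.size)))
    return np.sort(np.argsort(importance)[::-1][:n_keep])

# Toy example: one conv layer with 8 filters and made-up task scores.
cls_scores = np.array([0.9, 0.1, 0.4, 0.8, 0.05, 0.6, 0.2, 0.7])
seg_scores = np.array([0.2, 0.9, 0.3, 0.7, 0.10, 0.8, 0.1, 0.6])
imp = multitask_channel_importance(cls_scores, seg_scores, lam=0.5)
kept = select_channels(imp, keep_ratio=0.5)
print(kept)  # indices of the 4 filters surviving a ~2x channel reduction
```

Keeping half the channels in each layer roughly halves that layer's FLOPs, which is the spirit of the reported ~2× reduction; the paper additionally optimizes the importance scores over the whole network rather than layer by layer.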
