Paper Title

Boosting Multi-Label Image Classification with Complementary Parallel Self-Distillation

Authors

Jiazhi Xu, Sheng Huang, Fengtao Zhou, Luwen Huangfu, Daniel Zeng, Bo Liu

Abstract

Multi-Label Image Classification (MLIC) approaches usually exploit label correlations to achieve good performance. However, emphasizing correlations such as co-occurrence may overlook the discriminative features of the target itself and lead to model overfitting, thus undermining performance. In this study, we propose a generic framework named Parallel Self-Distillation (PSD) for boosting MLIC models. PSD decomposes the original MLIC task into several simpler MLIC sub-tasks via two elaborated complementary task-decomposition strategies named Co-occurrence Graph Partition (CGP) and Dis-occurrence Graph Partition (DGP). The MLIC models with fewer categories are then trained on these sub-tasks in parallel to learn the joint patterns and the category-specific patterns of labels, respectively. Finally, knowledge distillation is leveraged to learn a compact global ensemble over the full set of categories from these learned patterns, reconciling label-correlation exploitation with model overfitting. Extensive results on the MS-COCO and NUS-WIDE datasets demonstrate that our framework can be easily plugged into many MLIC approaches and improves the performance of recent state-of-the-art methods. An explainable visual study further validates that our method is able to learn both category-specific and co-occurring features. The source code is released at https://github.com/Robbie-Xu/CPSD.
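
To make the pipeline described in the abstract concrete, below is a minimal, self-contained PyTorch sketch of the general idea: labels are partitioned with a co-occurrence-based heuristic, per-group teachers supply soft targets for their label subsets, and a full-category student is trained with a combined hard-label and distillation loss. This is an assumption-laden illustration rather than the authors' released code (see the repository linked above); the helper names cgp_partition and psd_distillation_loss, the greedy partition heuristic, the temperature, and the loss weights are all hypothetical.

```python
# Illustrative sketch of parallel self-distillation, NOT the official CPSD code.
import torch
import torch.nn.functional as F


def cgp_partition(cooccurrence: torch.Tensor, num_groups: int):
    """Greedily split labels so that frequently co-occurring labels land in the
    same group (a rough stand-in for CGP); negating the score would instead
    group rarely co-occurring labels, in the spirit of DGP."""
    order = torch.argsort(cooccurrence.sum(dim=1), descending=True).tolist()
    # Seed each group with one of the most "connected" labels, then greedily
    # add every remaining label to the group it co-occurs with most,
    # lightly penalizing group size to keep the partition balanced.
    groups = [[idx] for idx in order[:num_groups]]
    for idx in order[num_groups:]:
        scores = [cooccurrence[idx, g].sum().item() - 0.1 * len(g) for g in groups]
        groups[scores.index(max(scores))].append(idx)
    return [torch.tensor(sorted(g), dtype=torch.long) for g in groups]


def psd_distillation_loss(student_logits, teacher_logits_per_group, label_groups,
                          targets, temperature=2.0, alpha=0.5):
    """Hard-label BCE on the full-category student plus a soft-target term that
    matches each slice of the student's logits to the teacher trained on that
    label subset."""
    bce = F.binary_cross_entropy_with_logits(student_logits, targets)
    kd = student_logits.new_zeros(())
    for t_logits, idx in zip(teacher_logits_per_group, label_groups):
        s_prob = torch.sigmoid(student_logits[:, idx] / temperature)
        t_prob = torch.sigmoid(t_logits / temperature).detach()
        kd = kd + F.binary_cross_entropy(s_prob, t_prob)
    kd = kd / len(label_groups)
    return (1.0 - alpha) * bce + alpha * kd


if __name__ == "__main__":
    num_labels, batch = 8, 4
    cooc = torch.rand(num_labels, num_labels)
    cooc = (cooc + cooc.t()) / 2                       # toy symmetric co-occurrence matrix
    label_groups = cgp_partition(cooc, num_groups=2)   # two simpler sub-tasks
    student_logits = torch.randn(batch, num_labels, requires_grad=True)
    teacher_logits = [torch.randn(batch, len(g)) for g in label_groups]  # stand-in teachers
    targets = torch.randint(0, 2, (batch, num_labels)).float()
    loss = psd_distillation_loss(student_logits, teacher_logits, label_groups, targets)
    loss.backward()
    print(float(loss))
```

In this sketch the complementary DGP-style decomposition would simply reuse the same greedy loop with the co-occurrence score negated, so that one set of teachers captures joint patterns and the other captures category-specific ones before both are distilled into the single full-category student.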
