Paper Title

Counting with Adaptive Auxiliary Learning

Authors

Yanda Meng, Joshua Bridge, Meng Wei, Yitian Zhao, Yihong Qiao, Xiaoyun Yang, Xiaowei Huang, Yalin Zheng

Abstract

This paper proposes an adaptive auxiliary task learning based approach for object counting problems. Unlike existing auxiliary task learning based methods, we develop an attention-enhanced adaptively shared backbone network to enable both task-shared and task-tailored feature learning in an end-to-end manner. The network seamlessly combines a standard Convolutional Neural Network (CNN) and a Graph Convolutional Network (GCN) for feature extraction and feature reasoning among different domains of tasks. Our approach gains enriched contextual information by iteratively and hierarchically fusing the features across different task branches of the adaptive CNN backbone. The whole framework pays special attention to the objects' spatial locations and varied density levels, informed by object (or crowd) segmentation and density level segmentation auxiliary tasks. In particular, thanks to the proposed dilated contrastive density loss function, our network benefits from individual and regional context supervision, in terms of pixel-independent and pixel-dependent feature learning mechanisms, along with strengthened robustness. Experiments on seven challenging multi-domain datasets demonstrate that our method achieves superior performance to the state-of-the-art auxiliary task learning based counting methods. Our code is publicly available at: https://github.com/smallmax00/Counting_With_Adaptive_Auxiliary
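To make the multi-task setup concrete, below is a minimal, hypothetical PyTorch sketch (not the authors' code; see the repository linked above for that). It shows a shared backbone with a density-regression head plus two auxiliary heads (crowd segmentation and density-level segmentation), and a training loss that combines per-pixel supervision with supervision on pooled regional sums, loosely mirroring the abstract's idea of pixel-independent and pixel-dependent (regional) context supervision. All class and function names here (MultiTaskCounter, counting_loss) are illustrative assumptions, and the regional term is a simple stand-in, not the paper's dilated contrastive density loss.

```python
# Hypothetical sketch of an auxiliary-task counting model, assuming a plain
# CNN backbone; the paper's adaptively shared CNN+GCN backbone is not shown.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiTaskCounter(nn.Module):
    def __init__(self, channels: int = 64, num_density_levels: int = 3):
        super().__init__()
        # Shared feature extractor used by all three task branches.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.density_head = nn.Conv2d(channels, 1, 1)                 # main task
        self.seg_head = nn.Conv2d(channels, 1, 1)                     # crowd mask (aux)
        self.level_head = nn.Conv2d(channels, num_density_levels, 1)  # density levels (aux)

    def forward(self, x):
        feats = self.backbone(x)
        return {
            "density": F.relu(self.density_head(feats)),
            "seg": self.seg_head(feats),
            "level": self.level_head(feats),
        }


def counting_loss(out, gt_density, gt_seg, gt_level, region: int = 8):
    """Pixel-wise MSE plus a regional (average-pooled) MSE on the density map,
    combined with the two auxiliary segmentation losses. A rough stand-in for
    individual + regional supervision, not the paper's loss."""
    pixel = F.mse_loss(out["density"], gt_density)
    regional = F.mse_loss(F.avg_pool2d(out["density"], region),
                          F.avg_pool2d(gt_density, region))
    seg = F.binary_cross_entropy_with_logits(out["seg"], gt_seg)
    level = F.cross_entropy(out["level"], gt_level)
    return pixel + regional + 0.1 * seg + 0.1 * level


if __name__ == "__main__":
    model = MultiTaskCounter()
    img = torch.randn(2, 3, 64, 64)
    out = model(img)
    loss = counting_loss(out,
                         gt_density=torch.rand(2, 1, 64, 64),
                         gt_seg=torch.randint(0, 2, (2, 1, 64, 64)).float(),
                         gt_level=torch.randint(0, 3, (2, 64, 64)))
    loss.backward()
    # The predicted count is the sum of the density map.
    print(float(out["density"].sum()), float(loss))
```

In this kind of setup, the auxiliary heads are only used during training to shape the shared features; at inference time, the count is obtained by summing the predicted density map.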
