Paper Title
Learning Affinity-Aware Upsampling for Deep Image Matting
Paper Authors
Paper Abstract
We show that learning affinity in upsampling provides an effective and efficient approach to exploiting pairwise interactions in deep networks. Second-order features are commonly used in dense prediction to build adjacency relations with a learnable module, such as non-local blocks, applied after upsampling. Since upsampling is itself essential, learning affinity within upsampling avoids additional propagation layers, offering the potential for building compact models. By viewing existing upsampling operators from a unified mathematical perspective, we generalize them into a second-order form and introduce Affinity-Aware Upsampling (A2U), where upsampling kernels are generated by a lightweight low-rank bilinear model and are conditioned on second-order features. Our upsampling operator can also be extended to downsampling. We discuss alternative implementations of A2U and verify their effectiveness on two detail-sensitive tasks: image reconstruction on a toy dataset, and a large-scale image matting task where affinity-based ideas constitute mainstream matting approaches. In particular, results on the Composition-1k matting dataset show that A2U achieves a 14% relative improvement in the SAD metric over a strong baseline with a negligible increase in parameters (<0.5%). Compared with the state-of-the-art matting network, we achieve 8% higher performance with only 40% of the model complexity.
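The kernel-generation idea in the abstract can be illustrated with a short PyTorch sketch. The snippet below is not the authors' implementation: the module name `A2USketch`, the projection layers `proj_p`/`proj_q`, the rank hyperparameter, and the simplification of predicting a single kernel per low-resolution location (the paper produces distinct kernels for each upsampled sub-pixel) are all assumptions made for clarity. It shows the core mechanism only: a low-rank bilinear model scores each location against its neighborhood, the softmax-normalized scores act as a data-dependent upsampling kernel, and that kernel reassembles the input features.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class A2USketch(nn.Module):
    """Minimal sketch of affinity-aware upsampling (illustrative, not the paper's code).

    Per location, a kernel over a k x k neighborhood is produced by a
    low-rank bilinear model on the features; the kernel then reassembles
    the neighborhood before upsampling.
    """

    def __init__(self, channels, scale=2, kernel_size=3, rank=8):
        super().__init__()
        self.scale = scale
        self.k = kernel_size
        # Low-rank 1x1 projections forming the bilinear model.
        self.proj_p = nn.Conv2d(channels, rank, 1)
        self.proj_q = nn.Conv2d(channels, rank, 1)

    def forward(self, x):
        b, c, h, w = x.shape
        k = self.k
        p = self.proj_p(x)                                    # (b, r, h, w)
        q = self.proj_q(x)                                    # (b, r, h, w)
        # Gather each location's k*k neighborhood of q-embeddings.
        q_nb = F.unfold(q, k, padding=k // 2)                 # (b, r*k*k, h*w)
        q_nb = q_nb.view(b, -1, k * k, h, w)                  # (b, r, k*k, h, w)
        # Bilinear affinity scores <p_i, q_j> for each neighbor j of i.
        scores = (p.unsqueeze(2) * q_nb).sum(dim=1)           # (b, k*k, h, w)
        kernel = F.softmax(scores, dim=1)                     # normalized kernel
        # Reassemble input neighborhoods with the affinity kernel.
        x_nb = F.unfold(x, k, padding=k // 2).view(b, c, k * k, h, w)
        out = (x_nb * kernel.unsqueeze(1)).sum(dim=2)         # (b, c, h, w)
        # Simplification: one kernel per low-res location, followed by a
        # plain nearest-neighbor upsample; the paper instead generates
        # scale*scale distinct kernels per location.
        return F.interpolate(out, scale_factor=self.scale, mode='nearest')
```

The low-rank projections are what keep the bilinear model cheap: pairwise scores are computed in an r-dimensional embedding rather than over the full c-channel features, which is consistent with the abstract's claim of a negligible (<0.5%) parameter overhead.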