Paper Title

Meta Transferring for Deblurring

Paper Authors

Po-Sheng Liu, Fu-Jen Tsai, Yan-Tsung Peng, Chung-Chi Tsai, Chia-Wen Lin, Yen-Yu Lin

Paper Abstract


Most previous deblurring methods were built with a generic model trained on blurred images and their sharp counterparts. However, these approaches might have sub-optimal deblurring results due to the domain gap between the training and test sets. This paper proposes a reblur-deblur meta-transferring scheme to realize test-time adaptation without using ground truth for dynamic scene deblurring. Since the ground truth is usually unavailable at inference time in a real-world scenario, we leverage the blurred input video to find and use relatively sharp patches as the pseudo ground truth. Furthermore, we propose a reblurring model to extract the homogenous blur from the blurred input and transfer it to the pseudo-sharps to obtain the corresponding pseudo-blurred patches for meta-learning and test-time adaptation with only a few gradient updates. Extensive experimental results show that our reblur-deblur meta-learning scheme can improve state-of-the-art deblurring models on the DVD, REDS, and RealBlur benchmark datasets.
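The abstract describes selecting relatively sharp patches from the blurred input video to serve as pseudo ground truth. The paper's actual sharpness criterion is not given here, so the sketch below assumes a common proxy, variance of a Laplacian response, to score and pick the sharpest patch across frames; function names and the patch size are illustrative, not from the paper.

```python
import numpy as np

def sharpness_score(patch: np.ndarray) -> float:
    # Variance of a discrete Laplacian response over the patch interior.
    # Sharper patches have stronger local intensity changes, hence higher variance.
    lap = (-4 * patch[1:-1, 1:-1]
           + patch[:-2, 1:-1] + patch[2:, 1:-1]
           + patch[1:-1, :-2] + patch[1:-1, 2:])
    return float(lap.var())

def select_pseudo_sharp(frames, patch_size=8):
    """Scan non-overlapping patches in every frame of a grayscale video
    and return the patch with the highest sharpness score, as a stand-in
    for the pseudo ground truth used in test-time adaptation."""
    best_patch, best_score = None, -1.0
    for frame in frames:
        h, w = frame.shape
        for y in range(0, h - patch_size + 1, patch_size):
            for x in range(0, w - patch_size + 1, patch_size):
                p = frame[y:y + patch_size, x:x + patch_size]
                s = sharpness_score(p)
                if s > best_score:
                    best_patch, best_score = p, s
    return best_patch, best_score
```

In the full scheme, the selected pseudo-sharp patch would then be passed through the reblurring model to synthesize its pseudo-blurred counterpart, giving a training pair for the few gradient updates of test-time adaptation.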
