Paper Title
LMQFormer: A Laplace-Prior-Guided Mask Query Transformer for Lightweight Snow Removal
Paper Authors
Paper Abstract
Snow removal aims to locate snow areas and recover clean images without repairing traces. Unlike rain, which is regular and semitransparent, snow with various patterns and degradations seriously occludes the background. As a result, state-of-the-art snow removal methods usually retain large parameter sizes. In this paper, we propose a lightweight but highly efficient snow removal network called the Laplace Mask Query Transformer (LMQFormer). First, we present a Laplace-VQVAE to generate a coarse mask as prior knowledge of snow. Instead of using the masks in the dataset, we aim to reduce both the information entropy of snow and the computational cost of recovery. Second, we design a Mask Query Transformer (MQFormer) to remove snow with the coarse mask, where we use two parallel encoders and a hybrid decoder to learn extensive snow features under lightweight requirements. Third, we develop a Duplicated Mask Query Attention (DMQA) that converts the coarse mask into a specific number of queries, which constrain the attention areas of MQFormer with reduced parameters. Experimental results on popular datasets demonstrate the efficiency of our proposed model, which achieves state-of-the-art snow removal quality with significantly reduced parameters and the lowest running time.
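The Laplace prior rests on a standard observation: a discrete Laplacian operator responds strongly to high-frequency content such as snowflake edges, so thresholding its response gives a rough snow mask. The sketch below is a hypothetical illustration of that idea only, not the authors' Laplace-VQVAE; the function names and the threshold value are assumptions for the example.

```python
# A minimal sketch (not the paper's code) of why a Laplacian response
# can serve as a coarse snow prior: snowflakes are high-frequency
# bright spots, so the Laplacian fires around them.

def laplacian(img):
    """Apply the 4-neighbour discrete Laplacian to a 2D grid (list of lists)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = (img[y - 1][x] + img[y + 1][x]
                         + img[y][x - 1] + img[y][x + 1]
                         - 4.0 * img[y][x])
    return out

def coarse_mask(img, thresh=0.5):
    """Threshold the Laplacian magnitude into a rough binary mask (thresh is an assumed value)."""
    lap = laplacian(img)
    return [[1 if abs(v) > thresh else 0 for v in row] for row in lap]

# A flat background with one bright "snowflake" pixel at the centre.
img = [[0.0] * 5 for _ in range(5)]
img[2][2] = 1.0
mask = coarse_mask(img)
```

In the paper, such a coarse mask is not used directly; it is refined by the Laplace-VQVAE and then duplicated into attention queries by DMQA to restrict where MQFormer attends.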