Paper Title

Unsupervised BatchNorm Adaptation (UBNA): A Domain Adaptation Method for Semantic Segmentation Without Using Source Domain Representations

Paper Authors

Marvin Klingner, Jan-Aike Termöhlen, Jacob Ritterbach, Tim Fingscheidt

Paper Abstract

In this paper we present a solution to the task of "unsupervised domain adaptation (UDA) of a given pre-trained semantic segmentation model without relying on any source domain representations". Previous UDA approaches for semantic segmentation either employed simultaneous training of the model in the source and target domains, or they relied on an additional network, replaying source domain knowledge to the model during adaptation. In contrast, we present our novel Unsupervised BatchNorm Adaptation (UBNA) method, which adapts a given pre-trained model to an unseen target domain without using -- beyond the existing model parameters from pre-training -- any source domain representations (neither data, nor networks) and which can also be applied in an online setting or using just a few unlabeled images from the target domain in a few-shot manner. Specifically, we partially adapt the normalization layer statistics to the target domain using an exponentially decaying momentum factor, thereby mixing the statistics from both domains. By evaluation on standard UDA benchmarks for semantic segmentation we show that this is superior to a model without adaptation and to baseline approaches using statistics from the target domain only. Compared to standard UDA approaches we report a trade-off between performance and usage of source domain representations.
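
The core mechanism described in the abstract is to update only the normalization layer statistics on unlabeled target-domain batches, using a momentum factor that decays exponentially over the adaptation steps so that source- and target-domain statistics are mixed rather than replaced. The snippet below is a minimal PyTorch sketch of this idea, not the authors' released implementation; the function name, the hyperparameter values (initial momentum, decay rate, number of steps), and the assumption that the data loader yields image batches are all illustrative.

```python
import torch

def ubna_adapt(model, target_loader, num_steps=50, init_momentum=0.1, decay=0.95):
    """Sketch: adapt only the BatchNorm running statistics of a pre-trained
    model to unlabeled target-domain images, using an exponentially decaying
    momentum so source and target statistics are blended, not replaced."""
    model.train()  # BN layers update running stats only in train mode
    for p in model.parameters():
        p.requires_grad_(False)  # no weights are trained; only BN buffers change

    bn_layers = [m for m in model.modules()
                 if isinstance(m, torch.nn.modules.batchnorm._BatchNorm)]

    with torch.no_grad():
        for step, images in enumerate(target_loader):  # unlabeled target images
            if step >= num_steps:
                break
            # Exponentially decaying momentum: later batches shift the
            # running mean/variance less and less.
            momentum = init_momentum * (decay ** step)
            for bn in bn_layers:
                bn.momentum = momentum
            model(images)  # forward pass updates running_mean / running_var

    model.eval()
    return model
```

In an online or few-shot setting, the number of adaptation steps would simply be bounded by the number of available target-domain batches.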
