Paper Title


P$^2$-GAN: Efficient Style Transfer Using Single Style Image

Paper Authors

Zhentan Zheng, Jianyi Liu

Abstract


Style transfer is a useful image synthesis technique that can re-render a given image in another artistic style while preserving its content information. The Generative Adversarial Network (GAN) is a widely adopted framework for this task because it represents local style patterns better than traditional Gram-matrix based methods. However, most previous methods rely on a sufficient number of pre-collected style images to train the model. In this paper, a novel Patch Permutation GAN (P$^2$-GAN) network is proposed that can efficiently learn the stroke style from a single style image. We use patch permutation to generate multiple training samples from the given style image, and design a patch discriminator that can seamlessly process both patch-wise images and natural images. We also propose a criterion based on local texture descriptors to quantitatively evaluate style transfer quality. Experimental results show that, compared with many state-of-the-art methods, our approach produces finer-quality re-renderings from a single style image with improved computational efficiency.
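The core data-augmentation idea in the abstract, generating many training samples from one style image via patch permutation, can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the patch size, grid layout, and sampling strategy here are assumptions.

```python
import numpy as np

def patch_permutation(style_img, patch_size=32, grid=4, n_samples=8, seed=None):
    """Sketch of patch permutation: randomly crop patches from a single
    style image and tile them into a grid to form new training samples.
    All hyperparameters (patch_size, grid, n_samples) are illustrative
    assumptions, not values from the paper."""
    rng = np.random.default_rng(seed)
    h, w, _ = style_img.shape
    samples = []
    for _ in range(n_samples):
        # Draw grid*grid random patches from the style image.
        tiles = []
        for _ in range(grid * grid):
            y = rng.integers(0, h - patch_size + 1)
            x = rng.integers(0, w - patch_size + 1)
            tiles.append(style_img[y:y + patch_size, x:x + patch_size])
        # Assemble the permuted patches into one grid image.
        rows = [np.concatenate(tiles[r * grid:(r + 1) * grid], axis=1)
                for r in range(grid)]
        samples.append(np.concatenate(rows, axis=0))
    return np.stack(samples)  # shape: (n_samples, grid*patch, grid*patch, C)
```

Each returned sample preserves the local stroke statistics of the style image while discarding its global layout, which is what lets the patch discriminator learn style from a single exemplar.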
