Paper Title


NeuralReshaper: Single-image Human-body Retouching with Deep Neural Networks

Authors

Beijia Chen, Yuefan Shen, Hongbo Fu, Xiang Chen, Kun Zhou, Youyi Zheng

Abstract

In this paper, we present NeuralReshaper, a novel method for semantic reshaping of human bodies in single images using deep generative networks. To achieve globally coherent reshaping effects, our approach follows a fit-then-reshape pipeline, which first fits a parametric 3D human model to a source human image and then reshapes the fitted 3D model with respect to user-specified semantic attributes. Previous methods rely on image warping to transfer 3D reshaping effects to the entire image domain and thus often cause distortions in both foreground and background. In contrast, we resort to generative adversarial nets conditioned on the source image and a 2D warping field induced by the reshaped 3D model, to achieve more realistic reshaping results. Specifically, we separately encode the foreground and background information in the source image using a two-headed UNet-like generator, and guide the information flow from the foreground branch to the background branch via feature-space warping. Furthermore, to deal with the lack of paired data (i.e., images of the same human body in varying shapes), we introduce a novel self-supervised strategy to train our network. Unlike previous methods, which often require manual effort to correct undesirable artifacts caused by incorrect body-to-image fitting, our method is fully automatic. Extensive experiments on both indoor and outdoor datasets demonstrate the superiority of our method over previous approaches.
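The abstract mentions guiding information flow between branches via feature-space warping under a 2D warping field. The sketch below illustrates the basic operation such a step relies on: backward warping of a feature map by a dense offset field with bilinear sampling. This is not the authors' code; the function name, single-channel layout, and clamping behavior are illustrative assumptions.

```python
# Illustrative sketch (not the paper's implementation): backward-warp a
# single-channel 2D feature map by a dense warping field. Each output
# location (y, x) samples the input at (y + dy, x + dx) with bilinear
# interpolation; sample coordinates are clamped to the image bounds.

def warp_features(feat, flow):
    """feat: H x W nested list of floats (feature map).
    flow: H x W nested list of (dy, dx) float offsets.
    Returns the warped H x W feature map."""
    H, W = len(feat), len(feat[0])
    out = [[0.0] * W for _ in range(H)]
    for y in range(H):
        for x in range(W):
            dy, dx = flow[y][x]
            # Clamp the sampling location to the valid coordinate range.
            sy = min(max(y + dy, 0.0), H - 1.0)
            sx = min(max(x + dx, 0.0), W - 1.0)
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, H - 1), min(x0 + 1, W - 1)
            wy, wx = sy - y0, sx - x0
            # Bilinear interpolation over the four nearest neighbours.
            top = feat[y0][x0] * (1 - wx) + feat[y0][x1] * wx
            bot = feat[y1][x0] * (1 - wx) + feat[y1][x1] * wx
            out[y][x] = top * (1 - wy) + bot * wy
    return out
```

In a network this same sampling would run on multi-channel feature tensors (e.g., via a differentiable grid-sampling op) so that gradients flow through the warp during training; the loop form above only makes the arithmetic explicit.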
