Paper Title

PoP-Net: Pose over Parts Network for Multi-Person 3D Pose Estimation from a Depth Image

Paper Authors

Yuliang Guo, Zhong Li, Zekun Li, Xiangyu Du, Shuxue Quan, Yi Xu

Paper Abstract

In this paper, a real-time method called PoP-Net is proposed to predict multi-person 3D poses from a depth image. PoP-Net learns to predict bottom-up part representations and top-down global poses in a single shot. Specifically, a new part-level representation, called the Truncated Part Displacement Field (TPDF), is introduced, which enables an explicit fusion process to unify the advantages of bottom-up part detection and global pose detection. Meanwhile, an effective mode selection scheme is introduced to automatically resolve conflicting cases between global pose and part detections. Finally, due to the lack of high-quality depth datasets for developing multi-person 3D pose estimation, we introduce the Multi-Person 3D Human Pose Dataset (MP-3DHP) as a new benchmark. MP-3DHP is designed to enable effective multi-person and background data augmentation in model training, and to evaluate 3D human pose estimators under uncontrolled multi-person scenarios. We show that PoP-Net achieves state-of-the-art results on both MP-3DHP and the widely used ITOP dataset, and offers significant efficiency advantages for multi-person processing. To demonstrate one application of our algorithm pipeline, we also show virtual avatars driven by the computed 3D joint positions. The MP-3DHP dataset and the evaluation code are available at: https://github.com/oppo-us-research/PoP-Net.
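
The fusion idea sketched in the abstract can be pictured with a small numerical example. The snippet below is only an illustration, under assumed shapes and names, of how a truncated displacement field might refine a coarse joint estimate and fall back to the global pose when no nearby part evidence exists; the function refine_joint, the (H, W, 2) field layout, and the truncation radius tau are assumptions made for this example and are not PoP-Net's actual implementation.

```python
# A minimal, illustrative sketch (not the paper's implementation) of how a
# truncated part displacement field could refine a coarse joint prediction.
import numpy as np

def refine_joint(global_xy, tpdf, tau=10.0):
    """Refine one 2D joint with a truncated displacement field.

    global_xy : (2,) array, coarse (x, y) joint location in pixels from a
                top-down global-pose branch (assumed input).
    tpdf      : (H, W, 2) array, per-pixel displacement toward the nearest
                detected part (assumed layout).
    tau       : truncation radius in pixels; beyond it the field is treated
                as uninformative and the global prediction is kept.
    """
    h, w, _ = tpdf.shape
    x = int(np.clip(round(global_xy[0]), 0, w - 1))
    y = int(np.clip(round(global_xy[1]), 0, h - 1))
    disp = tpdf[y, x]                       # displacement sampled at the coarse location
    if np.linalg.norm(disp) > tau:          # no nearby part evidence: keep the global pose
        return global_xy.copy()
    return global_xy + disp                 # otherwise snap toward the local part detection

# Toy usage: a constant field pushing every pixel 2 px right and 1 px down.
field = np.tile(np.array([2.0, 1.0]), (48, 64, 1))
print(refine_joint(np.array([10.0, 20.0]), field))   # -> [12. 21.]
```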
