Paper Title


BiHand: Recovering Hand Mesh with Multi-stage Bisected Hourglass Networks

Authors

Lixin Yang, Jiasen Li, Wenqiang Xu, Yiqun Diao, Cewu Lu

Abstract


3D hand estimation has been a long-standing research topic in computer vision. A recent trend aims not only to estimate the 3D hand joint locations but also to recover the mesh model. However, achieving those goals from a single RGB image remains challenging. In this paper, we introduce an end-to-end learnable model, BiHand, which consists of three cascaded stages, namely the 2D seeding stage, the 3D lifting stage, and the mesh generation stage. At the output of BiHand, the full hand mesh is recovered using the joint rotations and shape parameters predicted by the network. Inside each stage, BiHand adopts a novel bisecting design that allows the network to encapsulate two closely related pieces of information (e.g. 2D keypoints and silhouette in the 2D seeding stage, 3D joints and depth map in the 3D lifting stage, joint rotations and shape parameters in the mesh generation stage) in a single forward pass. As the two pieces of information represent different geometry or structure details, bisecting the data flow can facilitate optimization and increase robustness. For quantitative evaluation, we conduct experiments on two public benchmarks, namely the Rendered Hand Dataset (RHD) and the Stereo Hand Pose Tracking Benchmark (STB). Extensive experiments show that our model achieves superior accuracy in comparison with state-of-the-art methods, and can produce appealing 3D hand meshes under several severe conditions.
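The three-stage cascade described in the abstract can be sketched as a chain of functions, each returning the two related outputs its stage bisects into. This is only an illustrative skeleton: the stage bodies below are placeholders (the actual BiHand stages are bisected hourglass networks), and all function names and dummy values are assumptions, not the authors' code.

```python
# Illustrative sketch of BiHand's three cascaded, bisected stages.
# All internals are placeholders; real stages are learned hourglass networks.

def seed_2d(image):
    """2D seeding stage: one forward pass yields two related outputs."""
    keypoints_2d = [(0.5, 0.5)] * 21              # placeholder 2D joints (21 keypoints)
    silhouette = [[0] * 4 for _ in range(4)]      # placeholder hand mask
    return keypoints_2d, silhouette

def lift_3d(keypoints_2d, silhouette):
    """3D lifting stage: lifts 2D evidence to 3D joints and a depth map."""
    joints_3d = [(x, y, 0.0) for (x, y) in keypoints_2d]  # placeholder 3D joints
    depth_map = silhouette                                # placeholder depth map
    return joints_3d, depth_map

def generate_mesh(joints_3d, depth_map):
    """Mesh generation stage: predicts joint rotations and shape parameters,
    from which the full hand mesh is recovered."""
    joint_rotations = [(0.0, 0.0, 0.0)] * len(joints_3d)  # placeholder rotations
    shape_params = [0.0] * 10                             # placeholder shape vector
    return joint_rotations, shape_params

def bihand_forward(image):
    """End-to-end pipeline: each stage feeds both of its outputs forward."""
    kp2d, sil = seed_2d(image)
    j3d, depth = lift_3d(kp2d, sil)
    rotations, shape = generate_mesh(j3d, depth)
    return rotations, shape
```

The point of the sketch is the data flow: every stage emits a pair of closely related representations in a single pass, and the next stage consumes both, mirroring the bisecting design the abstract describes.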
