Paper Title

OctAttention: Octree-Based Large-Scale Contexts Model for Point Cloud Compression

Paper Authors

Chunyang Fu, Ge Li, Rui Song, Wei Gao, Shan Liu

Paper Abstract

In point cloud compression, sufficient contexts are significant for modeling the point cloud distribution. However, the contexts gathered by the previous voxel-based methods decrease when handling sparse point clouds. To address this problem, we propose a multiple-contexts deep learning framework called OctAttention employing the octree structure, a memory-efficient representation for point clouds. Our approach encodes octree symbol sequences in a lossless way by gathering the information of sibling and ancestor nodes. Expressly, we first represent point clouds with octree to reduce spatial redundancy, which is robust for point clouds with different resolutions. We then design a conditional entropy model with a large receptive field that models the sibling and ancestor contexts to exploit the strong dependency among the neighboring nodes and employ an attention mechanism to emphasize the correlated nodes in the context. Furthermore, we introduce a mask operation during training and testing to make a trade-off between encoding time and performance. Compared to the previous state-of-the-art works, our approach obtains a 10%-35% BD-Rate gain on the LiDAR benchmark (e.g. SemanticKITTI) and object point cloud dataset (e.g. MPEG 8i, MVUB), and saves 95% coding time compared to the voxel-based baseline. The code is available at https://github.com/zb12138/OctAttention.
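The abstract relies on the standard breadth-first octree serialization, in which every occupied node is summarized by an 8-bit occupancy symbol marking which of its eight child octants contain points; the resulting symbol sequence is what the entropy model compresses. The sketch below is a minimal plain-NumPy illustration of producing such a symbol stream from a raw point cloud. It is not the authors' implementation from the linked repository; the function name and normalization step are illustrative assumptions.

```python
import numpy as np

def octree_occupancy_symbols(points, depth):
    """Serialize a point cloud into per-level octree occupancy symbols (1-255).

    Each occupied node contributes one byte whose bits mark which of its eight
    child octants contain at least one point (breadth-first order). Minimal
    sketch only; not the OctAttention authors' code.
    """
    # Normalize coordinates to integer voxel indices in [0, 2**depth).
    mins = points.min(axis=0)
    span = (points.max(axis=0) - mins).max() + 1e-9
    voxels = np.unique(
        np.floor((points - mins) / span * (2 ** depth - 1)).astype(np.int64), axis=0
    )

    symbols_per_level = []
    nodes = {(0, 0, 0): voxels}   # node origin (in voxels) -> voxels inside the node
    size = 2 ** depth             # edge length of the current node in voxels
    for _ in range(depth):
        size //= 2                # child nodes have half the edge length
        next_nodes, level_symbols = {}, []
        for origin, pts in nodes.items():
            sym = 0
            for child in range(8):
                off = np.array([(child >> 2) & 1, (child >> 1) & 1, child & 1]) * size
                child_origin = np.asarray(origin) + off
                inside = np.all((pts >= child_origin) & (pts < child_origin + size), axis=1)
                if inside.any():
                    sym |= 1 << child
                    next_nodes[tuple(child_origin)] = pts[inside]
            level_symbols.append(sym)
        symbols_per_level.append(level_symbols)
        nodes = next_nodes
    return symbols_per_level

# Example: flatten the per-level symbols into the sequence an entropy coder would consume.
pts = np.random.rand(1000, 3).astype(np.float32)
levels = octree_occupancy_symbols(pts, depth=6)
stream = [s for level in levels for s in level]
```

In the paper's framework, each symbol in such a stream would be entropy coded with probabilities predicted from its sibling and ancestor context; the context window, attention layers, and mask operation described in the abstract sit on top of this serialization and are not shown here.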
