Paper Title
Point Cloud Quality Assessment: Dataset Construction and Learning-based No-Reference Metric
Paper Authors
Paper Abstract
Full-reference (FR) point cloud quality assessment (PCQA) has achieved impressive progress in recent years. However, in many cases, obtaining the reference point cloud is difficult, so no-reference (NR) metrics have become a research hotspot. Due to the lack of large-scale PCQA datasets, little research on NR-PCQA has been carried out. In this paper, we first build a large-scale PCQA dataset named LS-PCQA, which includes 104 reference point clouds and more than 22,000 distorted samples. In the dataset, each reference point cloud is augmented with 31 types of impairments (e.g., Gaussian noise, contrast distortion, local missing, and compression loss) at 7 distortion levels. Besides, each distorted point cloud is assigned a pseudo quality score as a substitute for its Mean Opinion Score (MOS). Inspired by the hierarchical perception system and considering the intrinsic attributes of point clouds, we propose an NR metric, ResSCNN, based on a sparse convolutional neural network (CNN) to accurately estimate the subjective quality of point clouds. We conduct several experiments to evaluate the performance of the proposed NR metric. The results demonstrate that ResSCNN achieves state-of-the-art (SOTA) performance among all existing NR-PCQA metrics and even outperforms some FR metrics. The dataset presented in this work will be made publicly accessible at http://smt.sjtu.edu.cn. The source code for the proposed ResSCNN can be found at https://github.com/lyp22/ResSCNN.
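To make the dataset-construction idea concrete, the sketch below generates one family of distorted samples (Gaussian geometry noise) at 7 levels from a single reference cloud. It is a minimal illustration only: the `base_sigma` scale, the level-to-noise mapping, and the random reference cloud are assumptions for demonstration, not parameters taken from LS-PCQA, which covers 31 impairment types in total.

```python
import numpy as np

def add_gaussian_noise(points: np.ndarray, level: int, base_sigma: float = 0.001) -> np.ndarray:
    """Jitter point coordinates with zero-mean Gaussian noise.

    `level` runs over 1..7 to mimic the 7 distortion levels in LS-PCQA;
    `base_sigma` is a hypothetical scale factor, not taken from the paper.
    """
    sigma = base_sigma * level
    noise = np.random.normal(0.0, sigma, size=points.shape)
    return points + noise

# Stand-in reference cloud (a real pipeline would load one of the 104 references).
reference = np.random.rand(2048, 3).astype(np.float32)

# Seven distorted copies, one per level, for this single impairment type.
distorted_set = [add_gaussian_noise(reference, level) for level in range(1, 8)]
```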
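As a rough illustration of how a sparse-CNN-based NR metric can regress a single quality score from raw point coordinates and attributes, here is a minimal sketch assuming MinkowskiEngine as the sparse-convolution backend. The layer widths, depth, and pooling choice are placeholders; this is not the authors' ResSCNN architecture (which, per its name, uses residual blocks and a hierarchical design).

```python
import torch
import torch.nn as nn
import MinkowskiEngine as ME

class TinySparseQualityNet(nn.Module):
    """Illustrative sparse-CNN regressor: two sparse convolutions,
    global average pooling, and a linear head producing one score."""

    def __init__(self, in_channels: int = 3):
        super().__init__()
        self.backbone = nn.Sequential(
            ME.MinkowskiConvolution(in_channels, 32, kernel_size=3, stride=2, dimension=3),
            ME.MinkowskiReLU(),
            ME.MinkowskiConvolution(32, 64, kernel_size=3, stride=2, dimension=3),
            ME.MinkowskiReLU(),
            ME.MinkowskiGlobalAvgPooling(),
        )
        self.head = ME.MinkowskiLinear(64, 1)  # regress one quality score per cloud

    def forward(self, x: ME.SparseTensor) -> torch.Tensor:
        return self.head(self.backbone(x)).F  # .F extracts the dense feature tensor

# Usage with a random voxelized cloud; real inputs would be quantized scans
# with color attributes as per-point features.
coords = ME.utils.batched_coordinates([torch.randint(0, 100, (2048, 3))])
feats = torch.rand(2048, 3)  # e.g., RGB attributes
score = TinySparseQualityNet()(ME.SparseTensor(features=feats, coordinates=coords))
```

Sparse convolutions operate only on occupied voxels, which is why they suit point clouds: memory and compute scale with the number of points rather than with the full 3D grid.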