Paper Title
UNIKD: UNcertainty-filtered Incremental Knowledge Distillation for Neural Implicit Representation
Paper Authors
Paper Abstract
Recent neural implicit representations (NIRs) have achieved great success in the tasks of 3D reconstruction and novel view synthesis. However, they require the images of a scene from different camera views to be available for one-time training. This is expensive, especially for scenarios with large-scale scenes and limited data storage. In view of this, we explore the task of incremental learning for NIRs in this work. We design a student-teacher framework to mitigate the catastrophic forgetting problem. Specifically, at the end of each time step we promote the student to be the new teacher, and let this teacher guide the training of the student in the next step. As a result, the student network is able to learn new information from the streaming data while simultaneously retaining old knowledge from the teacher network. Although intuitive, naively applying the student-teacher pipeline does not work well in our task: not all information from the teacher network is helpful, since the teacher is trained only on the old data. To alleviate this problem, we further introduce a random inquirer and an uncertainty-based filter to select only the useful information. Our proposed method is general and can thus be adapted to different implicit representations such as the neural radiance field (NeRF) and neural surface field. Extensive experimental results on both 3D reconstruction and novel view synthesis demonstrate the effectiveness of our approach compared with different baselines.
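To make the described mechanism concrete, below is a minimal PyTorch sketch of one incremental training step, written only from the abstract's description: the student fits the streaming data, a random inquirer queries the frozen teacher (the previous student), and an uncertainty-based filter keeps only confident teacher outputs for distillation. All names here (TinyNeRF, incremental_step, the softplus uncertainty head, the uniform ray sampling, and the threshold tau) are hypothetical illustrations, not the authors' implementation.

```python
# Sketch of an uncertainty-filtered student-teacher distillation step for an
# incremental neural implicit representation. Assumes the model predicts both
# a color and a scalar uncertainty per queried ray.

import copy
import torch
import torch.nn as nn

class TinyNeRF(nn.Module):
    """Toy stand-in for a neural implicit representation: maps a 6-D ray
    encoding (origin + direction) to an RGB color and a positive uncertainty."""
    def __init__(self, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(6, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 4),  # 3 color channels + 1 raw uncertainty
        )

    def forward(self, rays):
        out = self.net(rays)
        rgb = torch.sigmoid(out[:, :3])
        uncertainty = nn.functional.softplus(out[:, 3])  # keep positive
        return rgb, uncertainty

def incremental_step(student, teacher, new_rays, new_colors,
                     num_random_queries=1024, tau=0.05, distill_weight=1.0):
    """One training step at time t: supervised loss on the new streaming data
    plus a distillation loss on teacher-confident random queries."""
    # Supervised loss on the newly arrived data.
    pred_rgb, _ = student(new_rays)
    loss_new = ((pred_rgb - new_colors) ** 2).mean()

    # Random inquirer: sample query rays uniformly (a simplifying assumption;
    # in practice these would be valid rays inside the scene bounds).
    queries = torch.rand(num_random_queries, 6) * 2.0 - 1.0
    with torch.no_grad():
        teacher_rgb, teacher_unc = teacher(queries)

    # Uncertainty-based filter: distill only where the teacher is confident.
    keep = teacher_unc < tau
    if keep.any():
        student_rgb, _ = student(queries[keep])
        loss_old = ((student_rgb - teacher_rgb[keep]) ** 2).mean()
    else:
        loss_old = torch.zeros((), device=new_rays.device)

    return loss_new + distill_weight * loss_old

# Usage: at the end of each time step, the student becomes the new teacher.
student = TinyNeRF()
teacher = copy.deepcopy(student).eval()
for p in teacher.parameters():
    p.requires_grad_(False)

opt = torch.optim.Adam(student.parameters(), lr=1e-3)
rays, colors = torch.rand(512, 6), torch.rand(512, 3)  # dummy streaming batch
loss = incremental_step(student, teacher, rays, colors)
opt.zero_grad(); loss.backward(); opt.step()
```

The filtering step is the key design choice this sketch illustrates: because the teacher was trained only on old data, its predictions at arbitrary queries can be unreliable, so only low-uncertainty outputs are allowed to constrain the student.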