Paper Title
Geometric Knowledge Distillation: Topology Compression for Graph Neural Networks
Paper Authors
Paper Abstract
We study a new paradigm of knowledge transfer that aims at encoding graph topological information into graph neural networks (GNNs) by distilling knowledge from a teacher GNN model trained on a complete graph to a student GNN model operating on a smaller or sparser graph. To this end, we revisit the connection between thermodynamics and the behavior of GNNs, based on which we propose the Neural Heat Kernel (NHK) to encapsulate the geometric properties of the underlying manifold with respect to the GNN architecture. Aligning the NHKs of the teacher and student models yields a fundamental and principled solution, dubbed Geometric Knowledge Distillation. We develop non-parametric and parametric instantiations and demonstrate their efficacy in various experimental settings for knowledge distillation across different types of privileged topological information and teacher-student schemes.
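To make the kernel-alignment idea concrete, below is a minimal sketch in PyTorch. It is not the paper's method: it substitutes a normalized Gram matrix of per-layer node representations as a stand-in for the NHK, and aligns teacher and student kernels with a simple Frobenius-style loss. All names here (`neural_heat_kernel`, `geometric_kd_loss`) are hypothetical, and the paper's actual non-parametric and parametric NHK instantiations are derived from the heat-equation view of GNN propagation rather than this surrogate.

```python
import torch
import torch.nn.functional as F

def neural_heat_kernel(h: torch.Tensor) -> torch.Tensor:
    """Pairwise-similarity surrogate for a layer's neural heat kernel.

    h: (N, d) matrix of node representations at one GNN layer.
    A normalized Gram matrix is used purely for illustration.
    """
    h = F.normalize(h, dim=1)   # unit-norm rows for scale invariance
    return h @ h.t()            # (N, N) kernel matrix

def geometric_kd_loss(teacher_feats, student_feats):
    """Align teacher and student kernels layer by layer (squared error)."""
    loss = 0.0
    for ht, hs in zip(teacher_feats, student_feats):
        k_t = neural_heat_kernel(ht).detach()  # teacher is frozen
        k_s = neural_heat_kernel(hs)
        loss = loss + (k_s - k_t).pow(2).mean()
    return loss

# Toy usage: hidden states from two layers of a teacher and a student GNN,
# where the student uses a smaller hidden dimension.
if __name__ == "__main__":
    torch.manual_seed(0)
    teacher = [torch.randn(32, 64), torch.randn(32, 64)]
    student = [torch.randn(32, 16, requires_grad=True),
               torch.randn(32, 16, requires_grad=True)]
    loss = geometric_kd_loss(teacher, student)
    loss.backward()
    print(f"KD loss: {loss.item():.4f}")
```

Note that the kernels are (N, N) node-pair matrices, so the teacher and student may use different hidden dimensions while still being aligned; this is what lets topological structure transfer from a larger model or denser graph to a smaller one.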