Paper Title

KGNN: Harnessing Kernel-based Networks for Semi-supervised Graph Classification

Authors

Wei Ju, Junwei Yang, Meng Qu, Weiping Song, Jianhao Shen, Ming Zhang

Abstract

This paper studies semi-supervised graph classification, an important problem with various applications in social network analysis and bioinformatics. The problem is typically solved with graph neural networks (GNNs), which, however, rely on a large number of labeled graphs for training and are unable to leverage unlabeled graphs. We address these limitations by proposing the Kernel-based Graph Neural Network (KGNN). A KGNN consists of a GNN-based network as well as a kernel-based network parameterized by a memory network. The GNN-based network performs classification through learning graph representations to implicitly capture the similarity between query graphs and labeled graphs, while the kernel-based network uses graph kernels to explicitly compare each query graph with all the labeled graphs stored in a memory for prediction. The two networks are motivated from complementary perspectives, and thus combining them allows KGNN to use labeled graphs more effectively. We jointly train the two networks by maximizing their agreement on unlabeled graphs via posterior regularization, so that the unlabeled graphs serve as a bridge that lets the two networks mutually enhance each other. Experiments on a range of well-known benchmark datasets demonstrate that KGNN achieves impressive performance over competitive baselines.
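The abstract describes two prediction branches trained jointly so that they agree on unlabeled graphs. The snippet below is a minimal, hypothetical sketch of that training signal, not the authors' implementation: both branch architectures are abstracted to their output logits, and the posterior-regularization agreement term is approximated by a symmetric KL divergence between the two branches' predictive distributions on unlabeled graphs. The function and parameter names (`kgnn_style_loss`, `agree_weight`, etc.) are illustrative assumptions.

```python
# A minimal, hypothetical sketch of the joint training signal described in the
# abstract -- NOT the authors' code. Branch architectures are abstracted away;
# this function only combines their outputs into a single loss.
import torch
import torch.nn.functional as F


def kgnn_style_loss(gnn_logits_lab, ker_logits_lab, labels,
                    gnn_logits_unl, ker_logits_unl, agree_weight=1.0):
    """Supervised loss on labeled graphs plus an agreement loss on unlabeled graphs."""
    # Both branches are trained to classify the labeled graphs.
    sup = (F.cross_entropy(gnn_logits_lab, labels)
           + F.cross_entropy(ker_logits_lab, labels))

    # On unlabeled graphs, the two branches are encouraged to agree; a symmetric
    # KL divergence stands in here for the paper's posterior-regularization objective.
    log_p = F.log_softmax(gnn_logits_unl, dim=-1)
    log_q = F.log_softmax(ker_logits_unl, dim=-1)
    agree = 0.5 * (F.kl_div(log_p, log_q, reduction="batchmean", log_target=True)
                   + F.kl_div(log_q, log_p, reduction="batchmean", log_target=True))

    return sup + agree_weight * agree


# Example usage with random tensors standing in for the two branches' outputs.
if __name__ == "__main__":
    n_lab, n_unl, n_cls = 8, 16, 2
    labels = torch.randint(0, n_cls, (n_lab,))
    loss = kgnn_style_loss(torch.randn(n_lab, n_cls), torch.randn(n_lab, n_cls), labels,
                           torch.randn(n_unl, n_cls), torch.randn(n_unl, n_cls))
    print(loss.item())
```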
