Paper Title


Mixed Graph Contrastive Network for Semi-Supervised Node Classification

Authors

Xihong Yang, Yiqi Wang, Yue Liu, Yi Wen, Lingyuan Meng, Sihang Zhou, Xinwang Liu, En Zhu

Abstract


Graph Neural Networks (GNNs) have achieved promising performance in semi-supervised node classification in recent years. However, the problem of insufficient supervision, together with representation collapse, largely limits the performance of GNNs in this field. To alleviate the collapse of node representations in semi-supervised scenarios, we propose a novel graph contrastive learning method, termed Mixed Graph Contrastive Network (MGCN). In our method, we improve the discriminative capability of the latent embeddings through an interpolation-based augmentation strategy and a correlation reduction mechanism. Specifically, we first conduct the interpolation-based augmentation in the latent space and force the prediction model to change linearly between samples. Second, we enable the learned network to tell apart samples across two interpolation-perturbed views by forcing the cross-view correlation matrix to approximate an identity matrix. By combining the two settings, we extract rich supervision information from both the abundant unlabeled nodes and the rare yet valuable labeled nodes for discriminative representation learning. Extensive experimental results on six datasets demonstrate the effectiveness and generality of MGCN compared with existing state-of-the-art methods. The code of MGCN is available on GitHub at https://github.com/xihongyang1999/MGCN.
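The abstract describes two components that are straightforward to picture in code: mixup-style interpolation of latent node embeddings, and a Barlow Twins-style correlation-reduction objective that pushes the cross-view correlation matrix toward the identity. Below is a minimal PyTorch sketch of these ideas. All function names, the Beta(alpha, alpha) mixing coefficient, and the off-diagonal weight are illustrative assumptions, not the authors' released implementation; refer to the GitHub link above for the official code.

```python
# Illustrative sketch only: names and hyperparameters are assumptions,
# not taken from the official MGCN repository.
import torch
import torch.nn.functional as F


def interpolation_augment(z: torch.Tensor, alpha: float = 1.0):
    """Mixup-style augmentation in the latent space: each node embedding is
    linearly interpolated with a randomly chosen (permuted) embedding."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(z.size(0), device=z.device)
    return lam * z + (1.0 - lam) * z[perm], perm, lam


def mixup_prediction_loss(logits_mixed, y, perm, lam):
    """Force the classifier to change linearly between samples: the
    prediction on a mixed embedding should match the mixed labels."""
    return lam * F.cross_entropy(logits_mixed, y) + \
        (1.0 - lam) * F.cross_entropy(logits_mixed, y[perm])


def correlation_reduction_loss(z1, z2, off_diag_weight=5e-3):
    """Push the cross-view correlation matrix toward the identity matrix."""
    n = z1.size(0)
    z1 = (z1 - z1.mean(0)) / (z1.std(0) + 1e-6)   # column-wise standardization
    z2 = (z2 - z2.mean(0)) / (z2.std(0) + 1e-6)
    c = (z1.T @ z2) / n                            # d x d correlation matrix
    on_diag = (torch.diagonal(c) - 1.0).pow(2).sum()
    off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()
    return on_diag + off_diag_weight * off_diag


# Toy usage with random embeddings standing in for a GNN encoder's output.
z = torch.randn(2708, 64)                          # e.g. a Cora-sized node set
z1, _, _ = interpolation_augment(z)
z2, _, _ = interpolation_augment(z)
unsup_loss = correlation_reduction_loss(z1, z2)
```

Weighting the off-diagonal penalty well below the diagonal term is common practice for this family of correlation-reduction losses; the exact weighting and how the mixup loss on labeled nodes is combined with the unsupervised term are not specified in the abstract.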
