Paper Title
Convergence of Invariant Graph Networks
Paper Authors
Paper Abstract
Although theoretical properties of graph neural networks (GNNs), such as expressive power and over-smoothing, have been extensively studied recently, their convergence properties are a relatively new direction. In this paper, we investigate the convergence of one powerful GNN, the Invariant Graph Network (IGN), over graphs sampled from graphons. We first prove the stability of the linear layers of general $k$-IGNs (of order $k$), based on a novel interpretation of linear equivariant layers. Building upon this result, we prove the convergence of $k$-IGNs under the model of \citet{ruiz2020graphon}, where one has access to the edge weights but the convergence error is measured for graphon inputs. Under the more natural (and more challenging) setting of \citet{keriven2020convergence}, where one can only access the 0-1 adjacency matrix sampled according to the edge probabilities, we first show a negative result: convergence is not possible for any IGN. We then obtain convergence for a subset of IGNs, denoted IGN-small, after an edge probability estimation step. We show that IGN-small still contains a function class rich enough to approximate spectral GNNs arbitrarily well. Lastly, we perform experiments on various graphon models to verify our statements.
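To make the central object concrete, here is a minimal sketch (not code from the paper) of the kind of linear permutation-equivariant layer that a 2-IGN stacks. Maron et al. (2019) show that linear equivariant maps on $n \times n$ inputs are spanned by 15 basis operations; for brevity the hypothetical function `equivariant_layer_2d` below combines only a handful of them with scalar weights, with normalizations chosen for illustration only.

```python
import numpy as np

def equivariant_layer_2d(X, w):
    """Linear combination of a few permutation-equivariant basis ops on an n x n array X."""
    n = X.shape[0]
    ones = np.ones((n, n))
    ops = [
        X,                                          # identity
        X.T,                                        # transpose
        np.diag(np.diag(X)),                        # keep only the diagonal
        (X.sum(axis=1, keepdims=True) / n) * ones,  # broadcast (normalized) row sums
        (X.sum(axis=0, keepdims=True) / n) * ones,  # broadcast (normalized) column sums
        (X.sum() / n**2) * ones,                    # broadcast (normalized) total sum
    ]
    assert len(w) == len(ops)
    return sum(wi * op for wi, op in zip(w, ops))

# Equivariance check: permuting rows/columns of the input permutes the output the same way.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 5))
P = np.eye(5)[rng.permutation(5)]
w = rng.standard_normal(6)
assert np.allclose(equivariant_layer_2d(P @ X @ P.T, w),
                   P @ equivariant_layer_2d(X, w) @ P.T)
```

The closing assertion is the defining property $L(PXP^\top) = P\,L(X)\,P^\top$; the paper's stability result concerns layers of exactly this linear equivariant form.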
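The two access models compared in the abstract can also be sketched. Below is a small illustration (the identifiers `graphon` and `sample_from_graphon`, and the particular kernel, are hypothetical): under the \citet{ruiz2020graphon}-style model one observes the edge-weight matrix $W(u_i, u_j)$ directly, while under the \citet{keriven2020convergence}-style model one only observes a 0-1 adjacency matrix whose entries are Bernoulli draws with those probabilities.

```python
import numpy as np

def graphon(x, y):
    """An example symmetric graphon W: [0,1]^2 -> [0,1]."""
    return 0.5 * (1.0 + np.cos(np.pi * (x - y)))

def sample_from_graphon(n, rng):
    # Latent positions u_1, ..., u_n drawn uniformly from [0, 1].
    u = rng.uniform(size=n)
    # Edge-weight access (Ruiz et al. style): the matrix of probabilities itself.
    P = graphon(u[:, None], u[None, :])
    np.fill_diagonal(P, 0.0)
    # 0-1 access (Keriven et al. style): Bernoulli draws from those probabilities,
    # symmetrized to yield a simple undirected graph.
    U = rng.uniform(size=(n, n))
    A = np.triu((U < P).astype(float), k=1)
    A = A + A.T
    return P, A

P, A = sample_from_graphon(200, np.random.default_rng(0))
```

In the abstract's terms, the negative result says no IGN can converge when given only `A`; convergence is recovered for IGN-small after first estimating `P` from `A`.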