Title
Simplicial Attention Networks
Authors
Abstract
Graph representation learning methods have mostly been limited to the modelling of node-wise interactions. Recently, there has been an increased interest in understanding how higher-order structures can be utilised to further enhance the learning abilities of graph neural networks (GNNs) in combinatorial spaces. Simplicial Neural Networks (SNNs) naturally model these interactions by performing message passing on simplicial complexes, higher-dimensional generalisations of graphs. Nonetheless, the computations performed by most existing SNNs are strictly tied to the combinatorial structure of the complex. Leveraging the success of attention mechanisms in structured domains, we propose Simplicial Attention Networks (SAT), a new type of simplicial network that dynamically weighs the interactions between neighbouring simplices and can readily adapt to novel structures. Additionally, we propose a signed attention mechanism that makes SAT orientation equivariant, a desirable property for models operating on (co)chain complexes. We demonstrate that SAT outperforms existing convolutional SNNs and GNNs on two image and trajectory classification tasks.
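The core idea (attention-weighted message passing between neighbouring simplices, with signs carrying relative orientation) can be sketched in NumPy. Everything below is an illustrative assumption rather than the paper's exact formulation: the function name, the restriction to upper adjacency only, the tanh logits, and the absolute-value trick used to make the attention coefficients orientation-invariant are all choices made for the sketch.

```python
import numpy as np

def simplicial_attention(X, B_up, W, a1, a2):
    """Illustrative signed attention pass over k-simplices (e.g. edges).

    X    : (n, f) features on the n k-simplices
    B_up : (n, m) signed incidence matrix to the (k+1)-simplices
    W    : (f, h) feature transform; a1, a2 : (h,) attention vectors
    """
    H = X @ W                      # (n, h) transformed features
    A = B_up @ B_up.T              # signed upper adjacency between simplices
    np.fill_diagonal(A, 0)         # drop self-interactions
    mask = A != 0                  # which simplices share a (k+1)-simplex
    sign = np.sign(A)              # relative orientation (+1 / -1)

    # Logits computed from |H| so the attention coefficients are invariant
    # to orientation flips (one simple way to achieve this; an assumption).
    Habs = np.abs(H)
    logits = np.tanh((Habs @ a1)[:, None] + (Habs @ a2)[None, :])

    # Masked softmax over each simplex's upper neighbourhood.
    scores = np.where(mask, logits, -np.inf)
    e = np.exp(scores - scores.max(axis=1, keepdims=True, initial=0.0))
    e = np.where(mask, e, 0.0)
    alpha = e / np.maximum(e.sum(axis=1, keepdims=True), 1e-12)

    # Sign-weighted aggregation: flipping a simplex's orientation flips
    # the sign of its row in B_up, hence of its contribution here, which
    # is what makes the layer orientation equivariant.
    return (alpha * sign) @ H
```

Because the coefficients depend only on |H| while the aggregation carries the relative-orientation sign, negating any simplex's orientation (its row in X and B_up) simply negates the corresponding row of the output, i.e. the layer commutes with orientation changes.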