Paper Title
SpaceE: Knowledge Graph Embedding by Relational Linear Transformation in the Entity Space
Paper Authors
Paper Abstract
Translation-distance-based knowledge graph embedding (KGE) methods, such as TransE and RotatE, model relations in knowledge graphs as translations or rotations in a vector space. Both translation and rotation are injective; that is, applying a translation or rotation to different vectors yields different results. In knowledge graphs, however, different entities may hold the same relation to the same entity; for example, many actors starred in one movie. Such non-injective relation patterns cannot be modeled well by the translation or rotation operations of existing translation-distance-based KGE methods. To tackle this challenge, we propose a translation-distance-based KGE method called SpaceE that models relations as linear transformations. SpaceE embeds both entities and relations in knowledge graphs as matrices, and it naturally models non-injective relations with singular linear transformations. We theoretically demonstrate that SpaceE is a fully expressive model with the ability to infer multiple desired relation patterns, including symmetry, skew-symmetry, inversion, Abelian composition, and non-Abelian composition. Experimental results on link prediction datasets show that SpaceE substantially outperforms previous translation-distance-based knowledge graph embedding methods, especially on datasets with many non-injective relations. The code, built on the PaddlePaddle deep learning platform, is available at https://www.paddlepaddle.org.cn.
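The core observation of the abstract can be illustrated in a few lines of NumPy. This is a minimal sketch with invented names and a toy 2-D dimension, not the paper's implementation: a singular (rank-deficient) relation matrix maps two distinct entity vectors to the same image, which is exactly the non-injective behavior (many actors, one movie) that a translation vector, as in TransE, can never produce.

```python
import numpy as np

# Hypothetical toy example (2-D for readability; SpaceE itself embeds
# entities and relations as matrices in higher dimensions).
R = np.array([[1.0, 0.0],
              [0.0, 0.0]])  # rank 1 => singular, hence non-injective

actor_1 = np.array([2.0, 3.0])
actor_2 = np.array([2.0, -5.0])  # differs from actor_1 only in R's null space

# The singular linear transformation sends both actors to the same "movie".
print(np.allclose(R @ actor_1, R @ actor_2))  # True

# By contrast, a TransE-style translation r is injective:
# actor_1 + r != actor_2 + r whenever actor_1 != actor_2.
r = np.array([0.5, 0.5])
print(np.allclose(actor_1 + r, actor_2 + r))  # False
```

The design choice follows directly: since every singular matrix collapses its null space to zero, choosing a rank-deficient linear transformation gives the model a built-in mechanism for relations where many head entities share one tail.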