Paper Title
Informative Text Generation from Knowledge Triples
Paper Authors
论文作者
Paper Abstract
With the development of the encoder-decoder architecture, researchers are able to study text generation tasks with broader types of data. Among them, KB-to-text aims at converting a set of knowledge triples into human-readable sentences. In the original setting, the task assumes that the input triples and the text are exactly aligned in terms of the knowledge/information they convey. In this paper, we extend this setting and explore how to enable the trained model to generate more informative text, i.e., text containing information about the triple entities that is not conveyed by the input triples. To solve this problem, we propose a novel memory-augmented generator that employs a memory network to memorize useful knowledge learned during training and utilizes such information, together with the input triples, to generate text in the operational or testing phase. We derive a dataset from WebNLG for our new setting and conduct extensive experiments to investigate the effectiveness of our model as well as uncover the intrinsic characteristics of the setting.
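
Since the abstract does not specify the model internals, the following is only a minimal illustrative sketch of the general idea of a memory-augmented generator: an encoder embeds the linearized input triples, a learned external memory stores knowledge acquired during training, and the decoder reads from both at each step. All module names and hyperparameters (e.g., `MemoryAugmentedGenerator`, `memory_slots`) are hypothetical and do not come from the paper.

```python
# Hypothetical sketch, not the authors' implementation: a memory-augmented
# encoder-decoder generator over linearized knowledge triples.
import torch
import torch.nn as nn


class MemoryAugmentedGenerator(nn.Module):
    def __init__(self, vocab_size, hidden_size=256, memory_slots=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.encoder = nn.GRU(hidden_size, hidden_size, batch_first=True)
        # Learned memory matrix: each row is a slot that can store useful
        # knowledge about entities seen during training.
        self.memory = nn.Parameter(torch.randn(memory_slots, hidden_size))
        self.decoder = nn.GRUCell(hidden_size * 2, hidden_size)
        self.out = nn.Linear(hidden_size, vocab_size)

    def read_memory(self, query):
        # Attention-style read: the decoder state queries the memory and
        # receives a weighted sum of slots (the "remembered" knowledge).
        scores = query @ self.memory.t()            # (batch, memory_slots)
        weights = torch.softmax(scores, dim=-1)
        return weights @ self.memory                # (batch, hidden_size)

    def forward(self, triple_tokens, target_tokens):
        # triple_tokens: token ids of linearized triples,
        # e.g. "<s> John_Doe | birthPlace | London </s>"
        _, enc_state = self.encoder(self.embed(triple_tokens))
        state = enc_state.squeeze(0)
        logits = []
        for t in range(target_tokens.size(1)):
            mem = self.read_memory(state)           # knowledge beyond the input triples
            inp = torch.cat([self.embed(target_tokens[:, t]), mem], dim=-1)
            state = self.decoder(inp, state)
            logits.append(self.out(state))
        return torch.stack(logits, dim=1)           # (batch, tgt_len, vocab_size)
```

The key design choice illustrated here is that the memory is a trainable parameter rather than part of the input, so at test time the decoder can surface information about the entities that was absorbed during training even when it is absent from the input triples.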