Paper Title

Is Graph Structure Necessary for Multi-hop Question Answering?

Authors

Nan Shao, Yiming Cui, Ting Liu, Shijin Wang, Guoping Hu

Abstract

Recently, modeling texts as graph structures and introducing graph neural networks to process them has become a trend in many NLP research areas. In this paper, we investigate whether graph structure is necessary for multi-hop question answering. Our analysis is centered on HotpotQA. We construct a strong baseline model to establish that, with the proper use of pre-trained models, graph structure may not be necessary for multi-hop question answering. We point out that both the graph structure and the adjacency matrix are task-related prior knowledge, and that graph attention can be considered a special case of self-attention. Experiments and visualized analysis demonstrate that graph attention, or even the entire graph structure, can be replaced by self-attention or Transformers.
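The claim that graph attention is a special case of self-attention can be illustrated concretely: restricting self-attention scores with an adjacency-matrix mask yields attention only along graph edges, while an all-ones mask recovers ordinary (fully connected) self-attention. The sketch below is a simplified single-head illustration of this masking argument, not the paper's actual model; the function name and shapes are illustrative assumptions.

```python
import numpy as np

def masked_self_attention(X, Wq, Wk, Wv, mask):
    """Scaled dot-product self-attention over node features X.

    mask[i, j] = 1 lets node i attend to node j; using the graph's
    adjacency matrix as the mask gives graph-style attention, while an
    all-ones mask gives standard self-attention.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    scores = np.where(mask.astype(bool), scores, -1e9)  # block non-edges
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
n, d = 4, 8                                   # 4 nodes, 8-dim features
X = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

adj = np.eye(n)                               # sparsest graph: self-loops only
full = np.ones((n, n))                        # fully connected "graph"

graph_out = masked_self_attention(X, Wq, Wk, Wv, adj)   # graph attention
self_out = masked_self_attention(X, Wq, Wk, Wv, full)   # plain self-attention
# With self-loops only, each node attends solely to itself, so the output
# reduces to X @ Wv; with the all-ones mask, the mask is vacuous and the
# computation is exactly unmasked self-attention.
```

This makes the paper's point mechanical: the adjacency matrix is just prior knowledge injected as an attention mask, so a Transformer layer, which learns where to attend, can in principle subsume the hand-specified graph structure.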
