Paper Title

Cross-Lingual Transfer in Zero-Shot Cross-Language Entity Linking

Paper Authors

Schumacher, Elliot, Mayfield, James, Dredze, Mark

Abstract

Cross-language entity linking grounds mentions in multiple languages to a single-language knowledge base. We propose a neural ranking architecture for this task that uses multilingual BERT representations of the mention and the context in a neural network. We find that the multilingual ability of BERT leads to robust performance in monolingual and multilingual settings. Furthermore, we explore zero-shot language transfer and find surprisingly robust performance. We investigate the zero-shot degradation and find that it can be partially mitigated by a proposed auxiliary training objective, but that the remaining error can best be attributed to domain shift rather than language transfer.
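To make the described setup concrete, below is a minimal sketch (not the authors' exact architecture) of ranking English knowledge-base candidates for a mention in another language using multilingual BERT. The model name, the use of the [CLS] vector as the representation, the dot-product scorer, and the example strings are all illustrative assumptions.

```python
# Hedged sketch: encode a mention-in-context and each candidate entity name with
# multilingual BERT, then rank candidates by dot-product similarity.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-multilingual-cased")
encoder = BertModel.from_pretrained("bert-base-multilingual-cased")
encoder.eval()

def encode(text: str) -> torch.Tensor:
    """Return the [CLS] vector of multilingual BERT for a piece of text."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=128)
    with torch.no_grad():
        outputs = encoder(**inputs)
    return outputs.last_hidden_state[:, 0, :].squeeze(0)  # [CLS] position

def rank_candidates(mention_context: str, candidates: list[str]) -> list[tuple[str, float]]:
    """Rank candidate entity names by similarity to the mention in its context."""
    mention_vec = encode(mention_context)
    scored = [(cand, float(torch.dot(mention_vec, encode(cand)))) for cand in candidates]
    return sorted(scored, key=lambda x: x[1], reverse=True)

# Example (hypothetical): a Spanish mention linked against an English-only knowledge base.
print(rank_candidates(
    "El presidente [Lincoln] pronunció el discurso de Gettysburg.",
    ["Abraham Lincoln (U.S. president)", "Lincoln, Nebraska (city)"],
))
```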
