Paper Title

PEL-BERT: A Joint Model for Protocol Entity Linking

Authors

Shoubin Li, Wenzao Cui, Yujiang Liu, Xuran Ming, Jun Hu, Yuanzhe Hu, Qing Wang

Abstract

Pre-trained models such as BERT are widely used in NLP tasks and are fine-tuned to consistently improve performance across a variety of NLP tasks. Nevertheless, a fine-tuned BERT model trained on our protocol corpus still performs weakly on the Entity Linking (EL) task. In this paper, we propose a model that joins a fine-tuned language model with an RFC Domain Model. Firstly, we design a Protocol Knowledge Base as the guideline for protocol EL. Secondly, we propose a novel model, PEL-BERT, to link named entities in protocols to categories in the Protocol Knowledge Base. Finally, we conduct a comprehensive study of the performance of pre-trained language models on descriptive texts and abstract concepts. Experimental results demonstrate that our model achieves state-of-the-art EL performance on our annotated dataset, outperforming all the baselines.
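To make the joint architecture concrete, below is a minimal sketch (PyTorch with the Hugging Face transformers library) of the idea the abstract describes: a fine-tuned BERT encoder whose [CLS] representation is concatenated with an RFC-domain feature vector and classified into a Protocol Knowledge Base category. All names here (JointEntityLinker, domain_feature_dim, the placeholder domain vector, the category count) are illustrative assumptions, not the authors' actual implementation.

# Illustrative sketch of a joint entity-linking model as described in the
# abstract: a fine-tuned BERT encoder combined with RFC-domain features,
# classifying a protocol entity mention into a Protocol Knowledge Base
# category. Module/parameter names are hypothetical, not the authors' code.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class JointEntityLinker(nn.Module):
    def __init__(self, num_categories: int, domain_feature_dim: int = 64,
                 bert_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)  # fine-tuned LM
        hidden = self.bert.config.hidden_size
        # Classify over the concatenation of BERT's [CLS] vector and a
        # domain feature vector (standing in for the RFC Domain Model).
        self.classifier = nn.Linear(hidden + domain_feature_dim, num_categories)

    def forward(self, input_ids, attention_mask, domain_features):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        cls_vec = out.last_hidden_state[:, 0]           # [CLS] representation
        joint = torch.cat([cls_vec, domain_features], dim=-1)
        return self.classifier(joint)                   # logits over KB categories

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = JointEntityLinker(num_categories=10)            # category count is illustrative
enc = tokenizer("TCP uses a three-way handshake.", return_tensors="pt")
domain = torch.zeros(1, 64)                             # placeholder RFC-domain features
logits = model(enc["input_ids"], enc["attention_mask"], domain)
pred = logits.argmax(dim=-1)                            # predicted KB category index

Concatenating the two representations before a single classification layer is only one plausible way to "join" the language model with a domain model; the paper's actual fusion mechanism may differ.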
