Title

What Has Been Enhanced in my Knowledge-Enhanced Language Model?

Authors

Yifan Hou, Guoji Fu, Mrinmaya Sachan

Abstract

Pretrained language models (LMs) do not capture factual knowledge very well. This has led to the development of a number of knowledge integration (KI) methods which aim to incorporate external knowledge into pretrained LMs. Even though KI methods show some performance gains over vanilla LMs, the inner workings of these methods are not well understood. For instance, it is unclear how and what kind of knowledge is effectively integrated into these models, and whether such integration may lead to catastrophic forgetting of already learned knowledge. This paper revisits the KI process in these models with an information-theoretic view and shows that KI can be interpreted using a graph convolution operation. We propose a probe model called Graph Convolution Simulator (GCS) for interpreting knowledge-enhanced LMs and exposing what kind of knowledge is integrated into these models. We conduct experiments to verify that our GCS can indeed be used to correctly interpret the KI process, and we use it to analyze two well-known knowledge-enhanced LMs: ERNIE and K-Adapter. We find that only a small amount of factual knowledge is integrated into them. We stratify knowledge in terms of various relation types and find that ERNIE and K-Adapter integrate different kinds of knowledge to different extents. Our analysis also shows that simply increasing the size of the KI corpus may not lead to better KI; fundamental advances may be needed.
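
The abstract's central claim is that KI can be interpreted as a graph convolution operation over a knowledge graph of entities. Below is a minimal NumPy sketch of the standard (Kipf-Welling-style) graph convolution that this interpretation refers to; it is not the paper's GCS probe itself, and all names and dimensions are illustrative assumptions.

    import numpy as np

    def graph_convolution(H, A, W):
        """One graph convolution step: aggregate neighbor features, then transform.

        H : (n, d)  node (entity) embeddings
        A : (n, n)  adjacency matrix of the knowledge graph
        W : (d, d') weight matrix
        """
        # Add self-loops so each node keeps its own representation
        A_hat = A + np.eye(A.shape[0])
        # Symmetric degree normalization: D^{-1/2} (A + I) D^{-1/2}
        D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
        A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt
        # Aggregate over neighbors, transform, and apply a ReLU nonlinearity
        return np.maximum(A_norm @ H @ W, 0.0)

    # Toy example: 4 entities with 8-dim embeddings on a small graph
    rng = np.random.default_rng(0)
    H = rng.normal(size=(4, 8))
    A = np.array([[0, 1, 1, 0],
                  [1, 0, 0, 1],
                  [1, 0, 0, 1],
                  [0, 1, 1, 0]], dtype=float)
    W = rng.normal(size=(8, 8))
    print(graph_convolution(H, A, W).shape)  # (4, 8)

Under the paper's view, entity representations in a knowledge-enhanced LM shift toward such neighborhood-aggregated representations after KI, which is what a probe like GCS can test for.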
