Paper Title

Improving Long Tailed Document-Level Relation Extraction via Easy Relation Augmentation and Contrastive Learning

Paper Authors

Yangkai Du, Tengfei Ma, Lingfei Wu, Yiming Wu, Xuhong Zhang, Bo Long, Shouling Ji

Paper Abstract

Towards the real-world information extraction scenario, research on relation extraction is advancing to document-level relation extraction (DocRE). Existing approaches for DocRE aim to extract relations by encoding various information sources in the long context via novel model architectures. However, the inherent long-tailed distribution problem of DocRE is overlooked by prior work. We argue that mitigating the long-tailed distribution problem is crucial for DocRE in the real-world scenario. Motivated by the long-tailed distribution problem, we propose an Easy Relation Augmentation (ERA) method for improving DocRE by enhancing the performance of tailed relations. In addition, we further propose a novel contrastive learning framework based on our ERA, i.e., ERACL, which can further improve the model performance on tailed relations and achieve competitive overall DocRE performance compared to the state of the art.
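The abstract invokes contrastive learning over relation representations without spelling out the objective. Below is a minimal, hypothetical sketch of a supervised contrastive loss of the general kind such frameworks build on; it is not the paper's ERACL loss, and all names (supervised_contrastive_loss, reps, labels, temperature) are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def supervised_contrastive_loss(reps, labels, temperature=0.1):
    """Supervised contrastive loss over relation (entity-pair) representations.

    reps:   (N, d) tensor of pooled entity-pair embeddings
    labels: (N,)   tensor of relation-type ids; pairs sharing a label are positives
    """
    reps = F.normalize(reps, dim=-1)                    # compare in cosine-similarity space
    sim = reps @ reps.t() / temperature                 # (N, N) similarity logits
    n = reps.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=reps.device)
    pos_mask = labels.unsqueeze(0).eq(labels.unsqueeze(1)) & ~self_mask

    # log-softmax over all other examples, then average log-probability of positives
    sim = sim.masked_fill(self_mask, float('-inf'))
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    loss = -(log_prob * pos_mask).sum(dim=1) / pos_count

    # only anchors that have at least one positive contribute to the loss
    has_pos = pos_mask.any(dim=1)
    return loss[has_pos].mean() if has_pos.any() else reps.new_zeros(())
```

In a DocRE setting, reps would be entity-pair embeddings produced by the document encoder and labels their relation types; pulling together pairs that share a relation type is the general mechanism by which tail relations with few examples can borrow signal from one another.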
