Paper Title

Fine-grained Contrastive Learning for Definition Generation

Authors

Hengyuan Zhang, Dawei Li, Shiping Yang, Yanran Li

Abstract

Recently, pre-trained transformer-based models have achieved great success in the task of definition generation (DG). However, previous encoder-decoder models lack effective representation learning to contain full semantic components of the given word, which leads to generating under-specific definitions. To address this problem, we propose a novel contrastive learning method, encouraging the model to capture more detailed semantic representations from the definition sequence encoding. According to both automatic and manual evaluation, the experimental results on three mainstream benchmarks demonstrate that the proposed method could generate more specific and high-quality definitions compared with several state-of-the-art models.
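The abstract does not spell out the exact training objective, but the core idea of contrastive representation learning it describes can be illustrated with a generic InfoNCE-style loss. In the sketch below (a minimal NumPy illustration, not the authors' implementation), `anchors` might stand for the encoder representations of the given words and `positives` for the corresponding definition sequence encodings; matched pairs are pulled together while other in-batch pairs act as negatives. All function and variable names here are hypothetical.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """InfoNCE-style contrastive loss over a batch of embeddings.

    Each row of `positives` is the positive example for the same-index
    row of `anchors`; all other rows in the batch act as negatives.
    """
    # L2-normalize so dot products become cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = (a @ p.T) / temperature                 # (batch, batch) similarities
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Cross-entropy with the diagonal (matched pairs) as the targets
    return -np.mean(np.diag(log_probs))
```

Minimizing this loss encourages each anchor to be closer (in cosine similarity) to its own definition encoding than to any other definition in the batch, which is one standard way to realize the "capture more detailed semantic representations" objective the abstract mentions.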
