Paper Title

Feature Aggregation in Zero-Shot Cross-Lingual Transfer Using Multilingual BERT

Paper Authors

Beiduo Chen, Wu Guo, Quan Liu, Kun Tao

Paper Abstract

Multilingual BERT (mBERT), a language model pre-trained on large multilingual corpora, has impressive zero-shot cross-lingual transfer capabilities and performs surprisingly well on zero-shot POS tagging and Named Entity Recognition (NER), as well as on cross-lingual model transfer. At present, mainstream methods for cross-lingual downstream tasks always use the output of mBERT's last transformer layer as the representation of linguistic information. In this work, we explore the complementary properties of the lower layers relative to the last transformer layer of mBERT. A feature aggregation module based on an attention mechanism is proposed to fuse the information contained in different layers of mBERT. Experiments are conducted on four zero-shot cross-lingual transfer datasets, and the proposed method obtains performance improvements on the key multilingual benchmark tasks XNLI (+1.5%), PAWS-X (+2.4%), NER (+1.2 F1), and POS (+1.5 F1). Through analysis of the experimental results, we show that the layers before the last layer of mBERT can provide extra useful information for cross-lingual downstream tasks, and we explore the interpretability of mBERT empirically.
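To make the core idea concrete, below is a minimal sketch, in PyTorch with Hugging Face Transformers, of attention-based aggregation over the hidden states of all mBERT layers. The `AttentionLayerAggregator` class, its single-vector scoring scheme, and the example sentence are illustrative assumptions rather than the paper's exact architecture; only the general idea of fusing per-layer representations with learned attention weights comes from the abstract.

```python
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class AttentionLayerAggregator(nn.Module):
    """Fuse per-layer [CLS] representations with learned attention weights.

    Hypothetical sketch: a single learnable scorer assigns a weight to each
    layer's [CLS] vector, and the fused output is the weighted sum over layers.
    """

    def __init__(self, hidden_size: int):
        super().__init__()
        self.scorer = nn.Linear(hidden_size, 1)  # one attention score per layer

    def forward(self, all_hidden_states):
        # all_hidden_states: tuple of (batch, seq_len, hidden) tensors,
        # one per layer (embedding layer plus each transformer layer).
        stacked = torch.stack(all_hidden_states, dim=1)        # (batch, layers, seq, hidden)
        cls_per_layer = stacked[:, :, 0, :]                    # (batch, layers, hidden)
        weights = torch.softmax(self.scorer(cls_per_layer), dim=1)  # (batch, layers, 1)
        return (weights * cls_per_layer).sum(dim=1)            # (batch, hidden)


tokenizer = BertTokenizer.from_pretrained("bert-base-multilingual-cased")
mbert = BertModel.from_pretrained("bert-base-multilingual-cased",
                                  output_hidden_states=True)
aggregator = AttentionLayerAggregator(mbert.config.hidden_size)

inputs = tokenizer("mBERT transfers across languages.", return_tensors="pt")
with torch.no_grad():
    hidden_states = mbert(**inputs).hidden_states  # 13 tensors for the 12-layer model
fused = aggregator(hidden_states)
print(fused.shape)  # torch.Size([1, 768])
```

In a setup like this, the fused vector would feed a task-specific classification head (e.g., for XNLI or PAWS-X) instead of relying on the last layer's output alone, which is the contrast the abstract draws.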
