Paper Title
BERT Fine-tuning For Arabic Text Summarization
Paper Authors
Paper Abstract
Fine-tuning a pretrained BERT model is the state-of-the-art method for extractive and abstractive text summarization. In this paper we show how this fine-tuning method can be applied to Arabic, both to construct the first documented model for abstractive Arabic text summarization and to demonstrate its performance on Arabic extractive summarization. Our model builds on multilingual BERT, since Arabic does not have a pretrained BERT of its own. We first evaluate its performance on an English corpus before applying it to Arabic corpora on both the extractive and abstractive tasks.
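To make the extractive setting concrete, the sketch below shows the general shape of an extractive-summarization pipeline: embed each sentence, score it against the document as a whole, and keep the top-scoring sentences in their original order. This is a minimal illustration only, not the paper's fine-tuned model; the toy bag-of-words `embed` function is a stand-in for the multilingual-BERT sentence representations the paper actually uses, and all names here are illustrative.

```python
import math
from collections import Counter

def embed(sentence):
    # Toy bag-of-words vector; a stand-in for multilingual-BERT
    # sentence embeddings (NOT the paper's actual encoder).
    return Counter(sentence.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def extractive_summary(sentences, k=1):
    # Score each sentence by similarity to the document centroid,
    # keep the top-k, and return them in their original order.
    centroid = Counter()
    for s in sentences:
        centroid.update(embed(s))
    ranked = sorted(range(len(sentences)),
                    key=lambda i: cosine(embed(sentences[i]), centroid),
                    reverse=True)
    return [sentences[i] for i in sorted(ranked[:k])]

doc = ["BERT models can be fine-tuned for summarization.",
       "Multilingual BERT covers Arabic among other languages.",
       "The weather was pleasant."]
print(extractive_summary(doc, k=1))
```

A BERT-based extractive system keeps the same select-top-sentences structure but replaces the counting-based scorer with learned, fine-tuned per-sentence relevance scores.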