Paper Title

From Disfluency Detection to Intent Detection and Slot Filling

Paper Authors

Mai Hoang Dao, Thinh Hung Truong, Dat Quoc Nguyen

Paper Abstract

We present the first empirical study investigating the influence of disfluency detection on downstream tasks of intent detection and slot filling. We perform this study for Vietnamese -- a low-resource language that has no previous study as well as no public dataset available for disfluency detection. First, we extend the fluent Vietnamese intent detection and slot filling dataset PhoATIS by manually adding contextual disfluencies and annotating them. Then, we conduct experiments using strong baselines for disfluency detection and joint intent detection and slot filling, which are based on pre-trained language models. We find that: (i) disfluencies produce negative effects on the performances of the downstream intent detection and slot filling tasks, and (ii) in the disfluency context, the pre-trained multilingual language model XLM-R helps produce better intent detection and slot filling performances than the pre-trained monolingual language model PhoBERT, and this is opposite to what generally found in the fluency context.
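To illustrate the kind of joint intent detection and slot filling baseline the abstract refers to, the sketch below stacks an utterance-level intent classification head and a token-level slot-tagging head on top of a pre-trained XLM-R encoder. This is a minimal, assumption-based sketch rather than the authors' implementation: the label counts, the example utterance, and the use of Hugging Face AutoModel/AutoTokenizer are illustrative choices only.

```python
# Minimal sketch of a joint intent detection + slot filling model on XLM-R.
# Not the paper's code; label sizes and the sample sentence are placeholders.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class JointIntentSlotModel(nn.Module):
    def __init__(self, encoder_name: str, num_intents: int, num_slot_tags: int):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        self.intent_head = nn.Linear(hidden, num_intents)   # one intent label per utterance
        self.slot_head = nn.Linear(hidden, num_slot_tags)   # one BIO slot tag per token

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        token_states = out.last_hidden_state          # (batch, seq_len, hidden)
        sentence_state = token_states[:, 0]           # first token as utterance summary
        intent_logits = self.intent_head(sentence_state)
        slot_logits = self.slot_head(token_states)
        return intent_logits, slot_logits


# Example usage with the multilingual XLM-R encoder; label counts are hypothetical.
tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = JointIntentSlotModel("xlm-roberta-base", num_intents=25, num_slot_tags=120)

batch = tokenizer(["tôi muốn đặt vé máy bay đi Hà Nội"], return_tensors="pt")
intent_logits, slot_logits = model(batch["input_ids"], batch["attention_mask"])
```

In such a setup, swapping the encoder name (e.g., a monolingual Vietnamese model such as PhoBERT with its own tokenizer versus the multilingual XLM-R) is what allows the kind of comparison reported in the abstract, with both heads trained jointly on the intent and slot losses.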
