Paper Title
Language Model is All You Need: Natural Language Understanding as Question Answering
Paper Authors
Paper Abstract
Different flavors of transfer learning have shown tremendous impact in advancing research and applications of machine learning. In this work, we study the use of a specific family of transfer learning, where the target domain is mapped to the source domain. Specifically, we map Natural Language Understanding (NLU) problems to Question-Answering (QA) problems, and we show that in low-data regimes this approach offers significant improvements over other approaches to NLU. Moreover, we show that these gains can be increased through sequential transfer learning across NLU problems from different domains. We show that our approach can reduce the amount of data required for the same performance by up to a factor of 10.
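The core idea of the abstract can be illustrated with a minimal sketch: a slot-filling NLU task is reframed as extractive QA by turning each slot into a natural-language question and using the utterance as the QA context. The names below (`SLOT_QUESTIONS`, `nlu_to_qa`) are hypothetical illustrations, not taken from the paper.

```python
# Hypothetical sketch of the NLU-as-QA reframing described in the abstract.
# Each slot in a slot-filling task is phrased as a question; the user
# utterance serves as the context for an extractive QA model.

SLOT_QUESTIONS = {
    "destination": "What is the destination city?",
    "date": "On what date does the user want to travel?",
}

def nlu_to_qa(utterance):
    """Turn one utterance into (question, context) pairs for a QA model."""
    return [
        {"slot": slot, "question": question, "context": utterance}
        for slot, question in SLOT_QUESTIONS.items()
    ]

pairs = nlu_to_qa("Book me a flight to Boston on Friday")
# Each pair can now be fed to any pretrained extractive QA model,
# which is what lets the QA source domain transfer to the NLU target domain.
```

This framing is what allows a model pretrained on QA data to be reused for NLU with little in-domain data, which is the low-data-regime benefit the abstract reports.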