Paper Title

On Robust Incremental Learning over Many Multilingual Steps

Authors

Karan Praharaj, Irina Matveeva

Abstract

Recent work in incremental learning has introduced diverse approaches to tackling catastrophic forgetting, from data augmentation to optimized training regimes. However, most of them focus on very few training steps. We propose a method for robust incremental learning over dozens of fine-tuning steps using data from a variety of languages. We show that a combination of data augmentation and an optimized training regime allows us to continue improving the model even for as many as fifty training steps. Crucially, our augmentation strategy does not require retaining access to previous training data and is suitable in scenarios with privacy constraints.
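The abstract's setup can be sketched as a loop of sequential fine-tuning steps in which each step sees only its own (e.g., new-language) data plus augmentation synthesized from the current model, never stored past data. This is a minimal toy illustration of that loop shape only: the linear model, the `augment` heuristic (pseudo-labelling random inputs with the current model), and all hyperparameters are assumptions for illustration, not the paper's actual method.

```python
# Toy sketch: incremental fine-tuning over many sequential steps,
# with augmentation generated from the current model (hypothetical
# stand-in for the paper's privacy-preserving augmentation strategy).
import random


def train_step(weights, data, lr=0.1, epochs=20):
    """One fine-tuning step: plain SGD on this step's data only."""
    for _ in range(epochs):
        for x, y in data:
            pred = sum(w * xi for w, xi in zip(weights, x))
            err = pred - y
            weights = [w - lr * err * xi for w, xi in zip(weights, x)]
    return weights


def augment(weights, n=10):
    """Assumption: augmented examples are pseudo-labelled by the
    current model, so no previous training data need be retained."""
    rng = random.Random(0)
    out = []
    for _ in range(n):
        x = [rng.uniform(-1, 1), rng.uniform(-1, 1)]
        y = sum(w * xi for w, xi in zip(weights, x))  # pseudo-label
        out.append((x, y))
    return out


# Fifty sequential steps, mirroring the "as many as fifty" in the abstract.
weights = [0.0, 0.0]
for step in range(50):
    # Stand-in for this step's new (e.g., new-language) training data.
    step_data = [([1.0, float(step % 3)], 1.0)]
    mixed = step_data + augment(weights)  # augmented, no old data kept
    weights = train_step(weights, mixed)
```

The key property the sketch preserves is that `augment` depends only on the current model parameters, so each step's training set can be discarded once the step completes.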
