Paper Title

General Incremental Learning with Domain-aware Categorical Representations

Paper Authors

Jiangwei Xie, Shipeng Yan, Xuming He

Paper Abstract

Continual learning is an important problem for achieving human-level intelligence in real-world applications, as an agent must continuously accumulate knowledge in response to streaming data and tasks. In this work, we consider a general yet under-explored incremental learning problem in which both the class distribution and the class-specific domain distribution change over time. In addition to the typical challenges in class incremental learning, this setting also faces an intra-class stability-plasticity dilemma and intra-class domain imbalance. To address the above issues, we develop a novel domain-aware continual learning method based on the EM framework. Specifically, we introduce a flexible class representation based on the von Mises-Fisher mixture model to capture the intra-class structure, using an expansion-and-reduction strategy to dynamically increase the number of components according to the class complexity. Moreover, we design a bi-level balanced memory to cope with data imbalances within and across classes, which is combined with a distillation loss to achieve a better inter- and intra-class stability-plasticity trade-off. We conduct exhaustive experiments on three benchmarks: iDigits, iDomainNet, and iCIFAR-20. The results show that our approach consistently outperforms previous methods by a significant margin, demonstrating its superiority.
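For intuition, the following is a minimal mathematical sketch of a von Mises-Fisher (vMF) mixture used as a per-class representation over unit-normalized features. The notation here (class label c, component count K_c, mean directions mu_{c,k}, concentration kappa, mixing weights pi_{c,k}) is illustrative and not necessarily the paper's own:

```latex
% Illustrative vMF mixture density for class c (not the paper's exact notation):
% each class is modeled by K_c components with unit-norm mean directions
% \mu_{c,k}, shared concentration \kappa, and mixing weights \pi_{c,k}.
p(\mathbf{z} \mid y = c)
  = \sum_{k=1}^{K_c} \pi_{c,k}\, C_d(\kappa)\,
    \exp\!\bigl(\kappa\, \boldsymbol{\mu}_{c,k}^{\top} \mathbf{z}\bigr),
\qquad \lVert \mathbf{z} \rVert = \lVert \boldsymbol{\mu}_{c,k} \rVert = 1,
```

where C_d(kappa) is the vMF normalizing constant in d dimensions. Under this reading, the expansion-and-reduction strategy described in the abstract would adjust K_c per class as new domains arrive, so that more complex classes receive more components.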
