Paper Title
ACIL: Analytic Class-Incremental Learning with Absolute Memorization and Privacy Protection
Paper Authors
Abstract
Class-incremental learning (CIL) learns a classification model from training data whose classes arrive progressively. Existing CIL methods either suffer serious accuracy loss due to catastrophic forgetting, or invade data privacy by revisiting stored exemplars. Inspired by linear learning formulations, we propose analytic class-incremental learning (ACIL), which achieves absolute memorization of past knowledge while avoiding breaches of data privacy (i.e., without storing historical data). The absolute memorization is demonstrated in the sense that class-incremental learning with ACIL, given only the present data, yields results identical to those of its joint-learning counterpart, which consumes both present and historical samples. This equality is theoretically validated. Data privacy is ensured since no historical data are involved during the learning process. Empirical validations demonstrate ACIL's competitive accuracy, with near-identical results across various incremental task settings (e.g., 5-50 phases). This also allows ACIL to outperform state-of-the-art methods in large-phase scenarios (e.g., 25 and 50 phases).