Paper Title

Deep Class Incremental Learning from Decentralized Data

Authors

Xiaohan Zhang, Songlin Dong, Jinjie Chen, Qi Tian, Yihong Gong, Xiaopeng Hong

Abstract

In this paper, we focus on a new and challenging decentralized machine learning paradigm in which there are continuous inflows of data to be addressed and the data are stored in multiple repositories. We initiate the study of data decentralized class-incremental learning (DCIL) by making the following contributions. First, we formulate the DCIL problem and develop the experimental protocol. Second, we introduce a paradigm for creating a basic decentralized counterpart of typical (centralized) class-incremental learning approaches, thereby establishing a benchmark for DCIL research. Third, we further propose a Decentralized Composite knowledge Incremental Distillation framework (DCID) to continually transfer knowledge from historical models and multiple local sites to the general model. DCID consists of three main components: local class-incremental learning, collaborative knowledge distillation among local models, and aggregated knowledge distillation from the local models to the general one. We comprehensively investigate the DCID framework using different implementations of the three components. Extensive experimental results demonstrate the effectiveness of our DCID framework. The code for the baseline methods and the proposed DCID will be released at https://github.com/zxxxxh/DCIL.
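To make the three components concrete, below is a minimal PyTorch sketch of what one composite distillation round could look like. It is only an illustration of the idea described in the abstract, not the authors' implementation: the function names (kd_loss, dcid_local_step, dcid_aggregate_step), the loss weights (lambda_hist, lambda_peer), and the choice to aggregate by averaging the local models' soft predictions are all assumptions; see the repository linked above for the actual code.

```python
# Illustrative sketch of a DCID-style training round (not the authors' code).
import torch
import torch.nn.functional as F


def kd_loss(student_logits, teacher_logits, T=2.0):
    """Standard temperature-scaled KL knowledge-distillation loss."""
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)


def dcid_local_step(local_model, old_model, peer_models, x, y,
                    lambda_hist=1.0, lambda_peer=0.5):
    """One local-site update combining the three knowledge sources named in
    the abstract: (1) cross-entropy on the site's new-class data, (2)
    distillation from the historical (previous-phase) model, and (3)
    collaborative distillation from the other local models."""
    logits = local_model(x)
    loss = F.cross_entropy(logits, y)

    # (2) Distill from the historical model over the old classes it knows.
    with torch.no_grad():
        old_logits = old_model(x)
    n_old = old_logits.size(1)
    loss = loss + lambda_hist * kd_loss(logits[:, :n_old], old_logits)

    # (3) Collaborative distillation among peer local models.
    for peer in peer_models:
        with torch.no_grad():
            peer_logits = peer(x)
        loss = loss + lambda_peer * kd_loss(logits, peer_logits) / len(peer_models)
    return loss


def dcid_aggregate_step(general_model, local_models, x):
    """Aggregated distillation: the general model matches the averaged soft
    predictions of the local models (one simple aggregation choice)."""
    with torch.no_grad():
        teacher_logits = torch.stack([m(x) for m in local_models]).mean(dim=0)
    return kd_loss(general_model(x), teacher_logits)
```

Averaging the local teachers is only the simplest aggregation; the paper reports studying different implementations of each component, so weighted or confidence-based aggregation would slot into dcid_aggregate_step in the same way.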
