Paper Title
An Efficient Federated Distillation Learning System for Multi-task Time Series Classification
Paper Authors
Paper Abstract
This paper proposes an efficient federated distillation learning system (EFDLS) for multi-task time series classification (TSC). EFDLS consists of a central server and multiple mobile users, where different users may run different TSC tasks. EFDLS has two novel components: a feature-based student-teacher (FBST) framework and a distance-based weights matching (DBWM) scheme. Within each user, the FBST framework transfers knowledge from the teacher's hidden layers to the student's hidden layers via knowledge distillation, with the teacher and student sharing an identical network structure. For each connected user, the weights of the student model's hidden layers are uploaded to the EFDLS server periodically. The DBWM scheme is deployed on the server, using the least-squares distance to measure the similarity between the weights of two given models. The scheme finds a partner for each connected user such that the user's and the partner's weights are the closest among all uploaded weights. The server swaps the user's and the partner's weights and sends them back to the two users, which then load the received weights into their teachers' hidden layers. Experimental results show that the proposed EFDLS achieves excellent top-1 accuracy on a set of selected UCR2018 datasets.
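The abstract describes two mechanisms that a short sketch can make concrete. Below is a minimal Python sketch, not the authors' implementation: `fbst_distillation_loss` assumes a plain mean-squared-error feature-matching loss (the abstract does not specify the exact loss), and `dbwm_match` implements the described least-squares partner matching over flattened hidden-layer weights, assuming all users share the same hidden-layer architecture. The function names, the toy users `u1`..`u3`, and the weight shapes are hypothetical.

```python
import numpy as np

def fbst_distillation_loss(student_feats, teacher_feats):
    """Hypothetical FBST loss: mean squared error between the student's
    and the teacher's hidden-layer feature maps (list of arrays each)."""
    return float(np.mean([np.mean((s - t) ** 2)
                          for s, t in zip(student_feats, teacher_feats)]))

def dbwm_match(uploaded_weights):
    """Hypothetical DBWM matching on the server: for each user, pick the
    other user whose flattened hidden-layer weights minimize the squared
    (least-squares) distance. Returns {user_id: partner_id}."""
    flat = {u: np.concatenate([w.ravel() for w in ws])
            for u, ws in uploaded_weights.items()}
    partners = {}
    for u in flat:
        dists = {v: float(np.sum((flat[u] - flat[v]) ** 2))
                 for v in flat if v != u}
        partners[u] = min(dists, key=dists.get)
    return partners

# Toy usage: three users, each uploading two hidden-layer weight arrays.
rng = np.random.default_rng(0)
weights = {u: [rng.normal(size=(4, 4)), rng.normal(size=(4,))]
           for u in ("u1", "u2", "u3")}
print(dbwm_match(weights))  # each user mapped to its closest partner
```

Per the abstract, the server would then swap each matched pair's weights and send them back, and each user would load the received weights into its teacher's hidden layers before the next round of local distillation.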