Paper Title

PFL-MoE: Personalized Federated Learning Based on Mixture of Experts

Authors

Binbin Guo, Yuan Mei, Danyang Xiao, Weigang Wu, Ye Yin, Hongli Chang

Abstract

Federated learning (FL) is an emerging distributed machine learning paradigm that avoids data sharing among training nodes so as to protect data privacy. Under the coordination of the FL server, each client conducts model training using its own computing resources and private data set. The global model can be created by aggregating the training results of the clients. To cope with highly non-IID data distributions, personalized federated learning (PFL) has been proposed to improve overall performance by allowing each client to learn a personalized model. However, one major drawback of a personalized model is the loss of generalization. To achieve model personalization while maintaining generalization, in this paper we propose a new approach, named PFL-MoE, which mixes the outputs of the personalized model and the global model via the MoE architecture. PFL-MoE is a generic approach and can be instantiated by integrating existing PFL algorithms. In particular, we propose the PFL-MF algorithm, an instance of PFL-MoE based on the freeze-base PFL algorithm. We further improve PFL-MF by enhancing the decision-making ability of the MoE gating network and propose a variant algorithm, PFL-MFE. We demonstrate the effectiveness of PFL-MoE by training the LeNet-5 and VGG-16 models on the Fashion-MNIST and CIFAR-10 datasets with non-IID partitions.
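The abstract describes mixing the outputs of the global model and a client's personalized model through an MoE gating network. Below is a minimal PyTorch-style sketch of that idea; the class name PFLMoE, the single-linear-layer gate, and the convex-combination rule are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class PFLMoE(nn.Module):
    """Minimal sketch of the PFL-MoE idea: a per-client gating network mixes
    the outputs of the shared global model (one expert) and the locally
    trained personalized model (the other expert)."""

    def __init__(self, global_model, personal_model, input_dim):
        super().__init__()
        self.global_model = global_model      # shared expert from the FL server
        self.personal_model = personal_model  # expert personalized on local data
        # Simple gating network producing a per-example mixing weight in (0, 1).
        self.gate = nn.Sequential(
            nn.Flatten(),
            nn.Linear(input_dim, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        g = self.gate(x)                       # shape (batch, 1)
        out_global = self.global_model(x)      # logits from the global expert
        out_personal = self.personal_model(x)  # logits from the personalized expert
        # Convex combination of the two experts' outputs.
        return g * out_personal + (1.0 - g) * out_global
```

For example, with LeNet-5 experts on Fashion-MNIST (1x28x28 inputs), input_dim would be 784 and only the gate and the personalized expert would be updated locally, keeping the global expert frozen, in the spirit of the freeze-base variant described in the abstract.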
