Paper Title


FedBE: Making Bayesian Model Ensemble Applicable to Federated Learning

Authors

Hong-You Chen, Wei-Lun Chao

Abstract


Federated learning aims to collaboratively train a strong global model by accessing users' locally trained models but not their own data. A crucial step is therefore to aggregate local models into a global model, which has been shown challenging when users have non-i.i.d. data. In this paper, we propose a novel aggregation algorithm named FedBE, which takes a Bayesian inference perspective by sampling higher-quality global models and combining them via Bayesian model Ensemble, leading to much more robust aggregation. We show that an effective model distribution can be constructed by simply fitting a Gaussian or Dirichlet distribution to the local models. Our empirical studies validate FedBE's superior performance, especially when users' data are not i.i.d. and when the neural networks go deeper. Moreover, FedBE is compatible with recent efforts in regularizing users' model training, making it an easily applicable module: you only need to replace the aggregation method but leave other parts of your federated learning algorithm intact. Our code is publicly available at https://github.com/hongyouc/FedBE.
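The aggregation idea in the abstract can be illustrated with a minimal sketch: fit a Gaussian to the clients' local model weights, sample candidate global models from it, and average their predictions as a Bayesian model ensemble. This is only a toy illustration under assumed shapes, with a hypothetical linear `predict` standing in for a neural network; it is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each of K clients uploads a flattened weight vector.
K, D = 10, 5  # toy sizes: 10 clients, 5 parameters each
local_weights = rng.normal(size=(K, D))

# Fit a diagonal Gaussian to the local models (the abstract also mentions
# a Dirichlet distribution as an alternative).
mu = local_weights.mean(axis=0)
sigma = local_weights.std(axis=0)

# Sample M candidate global models from the fitted distribution.
M = 20
sampled = rng.normal(loc=mu, scale=sigma, size=(M, D))

def predict(w, x):
    """Toy stand-in for a network forward pass: logistic score."""
    return 1.0 / (1.0 + np.exp(-(w @ x)))

# Bayesian model ensemble: average the sampled models' predictions.
x = rng.normal(size=D)
ensemble_prob = float(np.mean([predict(w, x) for w in sampled]))
```

In the paper's setting the ensemble prediction would then be distilled back into a single global model so that the usual federated rounds can continue; the sketch stops at the ensemble step.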
