Paper Title

Probabilistic Federated Learning of Neural Networks Incorporated with Global Posterior Information

Authors

Peng Xiao, Samuel Cheng

Abstract

In federated learning, models trained on local clients are distilled into a global model. Because of the permutation invariance that arises in neural networks, hidden neurons must first be matched when performing federated learning with neural networks. Using a Bayesian nonparametric framework, Probabilistic Federated Neural Matching (PFNM) matches and fuses local neural networks so as to adapt to varying global model sizes and heterogeneous data. In this paper, we propose a new method that extends PFNM with a Kullback-Leibler (KL) divergence over the product of neural components, so that inference exploits posterior information at both the local and global levels. We also show theoretically that this additional term can be seamlessly incorporated into the match-and-fuse process. Through a series of simulations, we show that our new method outperforms popular state-of-the-art federated learning methods in both the single-communication-round and multiple-communication-round settings.
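The permutation invariance mentioned in the abstract refers to the fact that reordering the hidden neurons of a network (together with the corresponding weight rows and columns) leaves its function unchanged, which is why naive coordinate-wise averaging of local models can fail and neuron matching is needed first. A minimal NumPy sketch of this property for a one-hidden-layer MLP (the layer sizes and weights here are arbitrary, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))   # input -> hidden weights
b1 = rng.normal(size=4)        # hidden biases
W2 = rng.normal(size=(2, 4))   # hidden -> output weights
x = rng.normal(size=3)         # an arbitrary input

def forward(W1, b1, W2, x):
    h = np.maximum(W1 @ x + b1, 0.0)  # ReLU hidden layer
    return W2 @ h

perm = rng.permutation(4)
# Permute hidden neurons: reorder rows of W1 and b1,
# and the matching columns of W2.
out_original = forward(W1, b1, W2, x)
out_permuted = forward(W1[perm], b1[perm], W2[:, perm], x)

# The network computes the same function under the permutation.
assert np.allclose(out_original, out_permuted)
```

Because every permutation of the hidden units yields an equivalent network, two locally trained models may represent the same feature with neurons at different indices; matching neurons across clients (as PFNM does) resolves this ambiguity before fusion.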
