Paper Title

Efficient Federated Learning over Multiple Access Channel with Differential Privacy Constraints

Paper Authors

Amir Sonee, Stefano Rini

Paper Abstract

In this paper, the problem of federated learning (FL) through digital communication between clients and a parameter server (PS) over a multiple access channel (MAC), subject to differential privacy (DP) constraints, is studied. More precisely, we consider the setting in which clients in a centralized network are prompted to train a machine learning model using their local datasets. The information exchange between the clients and the PS takes place over the MAC and must also preserve the DP of the local datasets. Accordingly, the objective of the clients is to minimize the training loss subject to (i) rate constraints for reliable communication over the MAC and (ii) DP constraints on the local datasets. For this optimization scenario, we propose a novel consensus scheme in which digital distributed stochastic gradient descent (D-DSGD) is performed by each client. To preserve DP, digital artificial noise is also added by the users to the locally quantized gradients. The performance of the scheme is evaluated in terms of the convergence rate and DP level for a given MAC capacity. The performance is optimized over the choice of the quantization levels and the artificial noise parameters. Numerical evaluations are presented to validate the performance of the proposed scheme.
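The core mechanism the abstract describes is that each client quantizes its local gradient to a finite set of levels and then adds digital (discrete-valued) artificial noise before transmission, after which the PS aggregates the noisy quantized gradients. The sketch below illustrates one such round in plain Python. The quantizer resolution (`levels`), clipping bound (`g_max`), and the signed-step noise used here are illustrative stand-ins, not the paper's exact parameter choices or noise distribution, which the abstract does not specify.

```python
import random

def quantize(g, levels, g_max):
    """Clip g to [-g_max, g_max], then snap it to one of `levels`
    uniformly spaced quantization points on that interval."""
    g = max(-g_max, min(g_max, g))
    step = 2 * g_max / (levels - 1)
    idx = round((g + g_max) / step)
    return -g_max + idx * step

def private_quantized_grad(grad, levels=17, g_max=1.0, noise_span=2):
    """Quantize each coordinate, then add integer multiples of the
    quantization step as digital artificial noise (a stand-in for the
    discrete noise the scheme uses to preserve DP)."""
    step = 2 * g_max / (levels - 1)
    out = []
    for g in grad:
        q = quantize(g, levels, g_max)
        noise = step * sum(random.choice((-1, 1)) for _ in range(noise_span))
        out.append(q + noise)
    return out

# One aggregation round: the PS averages the clients' noisy quantized gradients.
client_grads = [[0.3, -0.7], [0.1, 0.9], [-0.5, 0.2]]
noisy = [private_quantized_grad(g) for g in client_grads]
avg = [sum(col) / len(noisy) for col in zip(*noisy)]
```

Because both the quantizer output and the noise live on the same discrete grid, every transmitted value is an integer multiple of the quantization step, which is what makes the scheme compatible with digital (rate-limited) communication over the MAC.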
