Paper Title
Towards Efficient Communications in Federated Learning: A Contemporary Survey
Paper Authors
Paper Abstract
In the traditional distributed machine learning scenario, users' private data are transmitted between clients and a central server, which poses significant privacy risks. To balance data privacy with the joint training of models, federated learning (FL) has been proposed as a distributed machine learning paradigm with privacy-protection mechanisms, enabling multi-party collaborative computing without revealing the original data. In practice, however, FL faces a variety of challenging communication problems. This survey seeks to elucidate the relationships among these communication issues by methodically assessing the development of FL communication research from three perspectives: communication efficiency, communication environment, and communication resource allocation. First, we identify the current challenges in FL communications. Second, we collate FL communication-related papers and describe the overall development trend of the field based on their logical relationships. Finally, we discuss future directions for research on communications in FL.
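To make the client-server communication pattern described in the abstract concrete, below is a minimal, generic sketch of one federated-averaging (FedAvg-style) round. It is not taken from the surveyed paper; all names (local_update, federated_round, the toy linear model) are illustrative assumptions. The point it demonstrates is that only model parameters travel over the network in each round, while the clients' raw data never leave the clients.

```python
# Minimal FedAvg-style sketch (illustrative only, not the paper's method):
# each client refines the global model on its private data, and the server
# aggregates the returned weights; raw data are never transmitted.
import numpy as np

def local_update(global_weights, data, labels, lr=0.1, epochs=1):
    """Client-side step: refine the global model on private local data."""
    w = global_weights.copy()
    for _ in range(epochs):
        preds = data @ w                      # simple linear model
        grad = data.T @ (preds - labels) / len(labels)
        w -= lr * grad
    return w                                  # only the weights are uploaded

def federated_round(global_weights, clients):
    """Server-side step: aggregate client updates, weighted by data size."""
    updates, sizes = [], []
    for data, labels in clients:              # (data, labels) stay on-device
        updates.append(local_update(global_weights, data, labels))
        sizes.append(len(labels))
    sizes = np.array(sizes, dtype=float)
    return np.average(updates, axis=0, weights=sizes / sizes.sum())

# Toy usage: three clients jointly fit y = 2*x without sharing their samples.
rng = np.random.default_rng(0)
clients = []
for _ in range(3):
    x = rng.normal(size=(50, 1))
    clients.append((x, 2.0 * x[:, 0] + 0.01 * rng.normal(size=50)))

w = np.zeros(1)
for _ in range(20):                           # 20 communication rounds
    w = federated_round(w, clients)
print("learned weight:", w)                   # approaches [2.0]
```

Each iteration of the outer loop corresponds to one communication round; the per-round payload (here a single weight vector, in practice a full model) is exactly the quantity that the efficiency, environment, and resource-allocation techniques surveyed in the paper aim to compress, schedule, or adapt.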