Paper Title
FedCD: Improving Performance in non-IID Federated Learning
Paper Authors
Paper Abstract
Federated learning has been widely applied to enable decentralized devices, each holding its own local data, to learn a shared model. However, learning from real-world data can be challenging, as it is rarely independently and identically distributed (IID) across edge devices, a key assumption of current high-performing, low-bandwidth algorithms. We present a novel approach, FedCD, which clones and deletes models to dynamically group devices with similar data. Experiments on the CIFAR-10 dataset show that FedCD achieves higher accuracy and faster convergence than a FedAvg baseline on non-IID data, while incurring minimal computation, communication, and storage overhead.
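The abstract only names the clone-and-delete mechanism for grouping devices with similar data; it does not specify the algorithm. The sketch below is a toy illustration of that general idea under assumptions of my own: the class names, the scoring rule, and the clone/delete thresholds are all hypothetical, and each device's local data is reduced to a mean vector so the grouping behaviour is easy to see. It should not be read as the authors' method.

# Illustrative sketch only. Assumptions (not from the paper): devices score each
# candidate model on their local data, train the best one, the server averages
# updates per model (FedAvg-style), clones models whose device group looks
# internally diverse, and deletes models that no device selects.
import numpy as np

rng = np.random.default_rng(0)

class Device:
    """An edge device with non-IID local data, reduced here to a mean vector."""
    def __init__(self, local_mean):
        self.local_mean = local_mean  # stands in for the device's local dataset

    def score(self, model):
        # Higher score = model fits this device's data better (negative distance).
        return -np.linalg.norm(model - self.local_mean)

    def local_update(self, model, lr=0.5):
        # One "training" step: move the model toward the local data.
        return model + lr * (self.local_mean - model)

def fedcd_round(models, devices, clone_margin=0.5, clone_min_group=3):
    """One round of a hypothetical clone-and-delete scheme:
    - each device picks and updates the model that best fits its data,
    - a model whose device group is large and internally diverse is cloned,
    - a model no device selects this round is deleted."""
    picks = {i: [] for i in models}  # model id -> updates from devices that chose it
    for dev in devices:
        best = max(models, key=lambda i: dev.score(models[i]))
        picks[best].append(dev.local_update(models[best]))
    new_models, next_id = {}, max(models) + 1
    for i, updates in picks.items():
        if not updates:
            continue  # "delete": drop models nobody selected
        avg = np.mean(updates, axis=0)  # FedAvg-style aggregation within the group
        new_models[i] = avg
        if len(updates) >= clone_min_group and np.std(updates, axis=0).mean() > clone_margin:
            # "clone": the group is diverse, so fork a slightly perturbed copy
            new_models[next_id] = avg + rng.normal(0, 0.1, size=avg.shape)
            next_id += 1
    return new_models

# Two latent data groups -> devices whose local data clusters around different means.
devices = [Device(np.array([0.0, 0.0]) + rng.normal(0, 0.1, 2)) for _ in range(5)]
devices += [Device(np.array([5.0, 5.0]) + rng.normal(0, 0.1, 2)) for _ in range(5)]
models = {0: rng.normal(0, 1.0, 2)}  # start from a single shared model
for _ in range(10):
    models = fedcd_round(models, devices)
print(len(models), "model(s) remain, grouped around:", [m.round(2) for m in models.values()])

Running this, the single shared model is cloned once the two data groups pull it in different directions, and each surviving model converges toward one group's data, which is the grouping effect the abstract attributes to FedCD.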