Title
Federated Learning System without Model Sharing through Integration of Dimensional Reduced Data Representations
Authors
Abstract
Dimensionality reduction is a commonly used element of a machine learning pipeline that helps to extract important features from high-dimensional data. In this work, we explore an alternative federated learning system that integrates dimensionality-reduced representations of distributed data prior to a supervised learning task, thus avoiding model sharing among the parties. We compare the performance of this approach on image classification tasks against three alternative frameworks: centralized machine learning, individual machine learning, and Federated Averaging, and analyze potential use cases for a federated learning system without model sharing. Our results show that our approach can achieve accuracy comparable to Federated Averaging, and that it outperforms Federated Averaging in a small-user setting.
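The abstract does not specify the exact integration mechanism, so the following is only a minimal sketch of the general idea under stated assumptions: each party applies PCA locally (via SVD) and shares only its dimensionality-reduced representations and labels, never a model; a central learner then trains on the integrated representations. The toy data, the per-party PCA, and the nearest-centroid classifier (a stand-in for the paper's supervised learner) are all hypothetical choices for illustration.

```python
import numpy as np

def local_pca(X, k):
    # Party-side reduction: only the k-dimensional representation leaves the party.
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:k]
    # Fix the sign ambiguity of each component deterministically (a common
    # convention: make the largest-magnitude loading positive) so that
    # independently computed projections are comparable across parties.
    signs = np.sign(W[np.arange(k), np.abs(W).argmax(axis=1)])
    return Xc @ (W * signs[:, None]).T

rng = np.random.default_rng(0)

# Hypothetical setup: 3 parties, each holding 100 samples of 20-dim, 2-class data.
parties = []
for _ in range(3):
    y = rng.integers(0, 2, 100)
    X = rng.normal(size=(100, 20)) + y[:, None] * 3.0  # class-separated toy data
    parties.append((X, y))

k = 5
# Server side: integrate the reduced representations from all parties.
Z = np.vstack([local_pca(X, k) for X, _ in parties])
y_all = np.concatenate([y for _, y in parties])

# Supervised learning on the integrated representations: nearest centroid.
centroids = np.array([Z[y_all == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((Z[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
print(f"train accuracy: {(pred == y_all).mean():.2f}")
```

Note that no gradients or model weights cross party boundaries here, in contrast to Federated Averaging; the only shared artifacts are the reduced data representations themselves.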