Paper Title

Gradient Tracking: A Unified Approach to Smooth Distributed Optimization

Authors

Jingwang Li, Housheng Su

Abstract

In this work, we study the classical distributed optimization problem over digraphs, where the objective function is a sum of smooth local functions. Inspired by the implicit tracking mechanism proposed in our earlier work, we develop a unified algorithmic framework from a pure primal perspective, i.e., UGT, which is essentially a generalized gradient tracking method and can unify most existing distributed optimization algorithms with constant step-sizes. It is proved that two variants of UGT can both achieve linear convergence if the global objective function is strongly convex. Finally, the performance of UGT is evaluated by numerical experiments.
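To make the abstract's terminology concrete, below is a minimal NumPy sketch of *standard* gradient tracking (the DIGing-style scheme that UGT generalizes) on an undirected ring network. All problem data here are illustrative assumptions, not from the paper: quadratic local objectives, a 4-agent ring with Metropolis weights, and the step size are all chosen for demonstration; the paper's actual setting is digraphs with a unified framework covering many such algorithms.

```python
import numpy as np

# Illustrative sketch of standard gradient tracking, the scheme the paper's
# UGT framework generalizes. All data below are assumed for demonstration.

n = 4                                  # number of agents
a = np.array([1.0, 2.0, 3.0, 6.0])     # local targets: f_i(x) = (x - a_i)^2 / 2
grad = lambda x: x - a                 # element-wise local gradients

# Doubly stochastic mixing matrix for a 4-agent ring (Metropolis weights)
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

alpha = 0.2                            # constant step size
x = np.zeros(n)                        # local estimates x_i
y = grad(x)                            # gradient trackers, y_i^0 = grad f_i(x_i^0)

for _ in range(200):
    x_next = W @ x - alpha * y         # consensus step + descent along tracked gradient
    y = W @ y + grad(x_next) - grad(x) # y_i tracks the network-average gradient
    x = x_next

# With strongly convex quadratics, all agents converge linearly to the
# global minimizer, here mean(a) = 3.0, matching the abstract's linear-rate claim.
print(x)
```

The tracker update keeps the invariant that the average of the `y_i` equals the average of the local gradients, which is what lets a constant step size yield exact (rather than neighborhood) convergence.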
