Paper Title

GTAdam: Gradient Tracking with Adaptive Momentum for Distributed Online Optimization

Authors

Guido Carnevale, Francesco Farina, Ivano Notarnicola, Giuseppe Notarstefano

Abstract

This paper deals with a network of computing agents aiming to solve an online optimization problem in a distributed fashion, i.e., by means of local computation and communication, without any central coordinator. We propose the gradient tracking with adaptive momentum estimation (GTAdam) distributed algorithm, which combines a gradient tracking mechanism with first and second order momentum estimates of the gradient. The algorithm is analyzed in the online setting for strongly convex cost functions with Lipschitz continuous gradients. We provide an upper bound for the dynamic regret given by a term related to the initial conditions and another term related to the temporal variations of the objective functions. Moreover, a linear convergence rate is guaranteed in the static setup. The algorithm is tested on a time-varying classification problem, on a (moving) target localization problem, and in a stochastic optimization setup from image classification. In these numerical experiments from multi-agent learning, GTAdam outperforms state-of-the-art distributed optimization methods.
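The mechanism described above — a gradient-tracking consensus update whose tracked direction is rescaled by Adam-style first- and second-moment estimates — can be sketched as follows. This is a minimal illustrative reconstruction, not the paper's exact update: the specific equations, step sizes, mixing matrix, and the quadratic test problem are all assumptions made for the example.

```python
import numpy as np

def gtadam_sketch(targets, W, steps=3000, alpha=0.005,
                  beta1=0.9, beta2=0.999, eps=1e-8):
    """Illustrative GTAdam-style iteration (reconstruction, not the
    paper's exact scheme): each agent i minimizes f_i(x) = 0.5*(x - a_i)^2,
    so the network optimum is the mean of the targets a_i."""
    grad = lambda x: x - targets           # local gradients, stacked per agent
    x = np.zeros(len(targets))             # local decision variables
    s = grad(x)                            # gradient trackers, init with local grads
    m = np.zeros_like(x)                   # first-moment estimates
    v = np.zeros_like(x)                   # second-moment estimates
    for _ in range(steps):
        m = beta1 * m + (1 - beta1) * s    # momentum on the *tracked* gradient
        v = beta2 * v + (1 - beta2) * s**2
        d = m / (np.sqrt(v) + eps)         # Adam-style adaptive direction
        x_new = W @ x - alpha * d          # consensus mixing + adaptive descent
        s = W @ s + grad(x_new) - grad(x)  # tracking update: 1^T s stays equal
        x = x_new                          #   to the sum of local gradients
    return x

# Ring of 4 agents with a doubly stochastic mixing matrix W;
# each agent holds one target, so the optimum is their mean (2.5).
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])
targets = np.array([1.0, 2.0, 3.0, 4.0])
x = gtadam_sketch(targets, W)
```

The tracking step preserves the invariant that the trackers sum to the sum of local gradients, so the Adam-style rescaling is applied to an estimate of the *global* gradient direction rather than each agent's local one; with a fixed step size the iterates settle into a small neighborhood of the consensus optimum.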
