Paper Title

Stochastic Gradient Descent for Barycenters in Wasserstein Space

Paper Authors

Julio Backhoff-Veraguas, Joaquin Fontbona, Gonzalo Rios, Felipe Tobar

Paper Abstract

We present and study a novel algorithm for the computation of 2-Wasserstein population barycenters of absolutely continuous probability measures on Euclidean space. The proposed method can be seen as a stochastic gradient descent procedure in the 2-Wasserstein space, as well as a manifestation of a Law of Large Numbers therein. The algorithm aims at finding a Karcher mean or critical point in this setting, and can be implemented "online", sequentially using i.i.d. random measures sampled from the population law. We provide natural sufficient conditions for this algorithm to a.s. converge in the Wasserstein space towards the population barycenter, and we introduce a novel, general condition which ensures uniqueness of Karcher means and moreover allows us to obtain explicit, parametric convergence rates for the expected optimality gap. We furthermore study the mini-batch version of this algorithm, and discuss examples of families of population laws to which our method and results can be applied. This work expands and deepens ideas and results introduced in an early version of \cite{backhoff2018bayesian}, in which a statistical application (and numerical implementation) of this method is developed in the context of Bayesian learning.
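To make the iteration concrete, below is a minimal sketch of a stochastic-gradient-style update for a 2-Wasserstein barycenter, restricted to Gaussian measures so that optimal transport maps are affine and available in closed form. The toy population law, the Robbins-Monro step size, and all function names (`ot_map_gaussian`, `sgd_step`, `sample_measure`) are illustrative assumptions for exposition, not the paper's implementation or its precise conditions.

```python
# Sketch of SGD in 2-Wasserstein space for Gaussian measures (assumed setting).
# Each step pushes the current iterate forward by (1 - gamma) * id + gamma * T,
# where T is the optimal transport map from the iterate to a freshly sampled measure.
import numpy as np
from scipy.linalg import sqrtm

def ot_map_gaussian(m_src, S_src, m_tgt, S_tgt):
    """Return (A, b) with T(x) = A x + b the 2-Wasserstein optimal map
    from N(m_src, S_src) to N(m_tgt, S_tgt)."""
    S_half = np.real(sqrtm(S_src))
    S_half_inv = np.linalg.inv(S_half)
    A = S_half_inv @ np.real(sqrtm(S_half @ S_tgt @ S_half)) @ S_half_inv
    b = m_tgt - A @ m_src
    return A, b

def sgd_step(m_k, S_k, m_nu, S_nu, gamma):
    """One online update: push N(m_k, S_k) forward by (1 - gamma) id + gamma T."""
    A, b = ot_map_gaussian(m_k, S_k, m_nu, S_nu)
    M = (1.0 - gamma) * np.eye(len(m_k)) + gamma * A
    return M @ m_k + gamma * b, M @ S_k @ M.T

def sample_measure(rng, d=2):
    """Draw a random Gaussian measure N(m, S) from a toy population law (assumed)."""
    m = rng.normal(size=d)
    W = rng.normal(size=(d, d))
    return m, W @ W.T + 0.5 * np.eye(d)  # random SPD covariance

rng = np.random.default_rng(0)
m, S = sample_measure(rng)                   # initialize at one draw from the population
for k in range(1, 2001):
    m_nu, S_nu = sample_measure(rng)         # i.i.d. measure sampled from the population law
    m, S = sgd_step(m, S, m_nu, S_nu, gamma=1.0 / (k + 1))  # Robbins-Monro style step

print("barycenter estimate: mean =", m)
print("covariance =\n", S)
```

In this Gaussian sketch the pushforward of a Gaussian under the affine interpolated map is again Gaussian, so the iterate can be tracked through its mean and covariance alone; for general absolutely continuous measures the same update would instead require estimating optimal transport maps numerically.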
