Title
Accelerated Message Passing for Entropy-Regularized MAP Inference
Authors
Abstract
Maximum a posteriori (MAP) inference in discrete-valued Markov random fields is a fundamental problem in machine learning that involves identifying the most likely configuration of random variables given a distribution. Due to the difficulty of this combinatorial problem, linear programming (LP) relaxations are commonly used to derive specialized message passing algorithms that are often interpreted as coordinate descent on the dual LP. To achieve more desirable computational properties, a number of methods regularize the LP with an entropy term, leading to a class of smooth message passing algorithms with convergence guarantees. In this paper, we present randomized methods for accelerating these algorithms by leveraging techniques that underlie classical accelerated gradient methods. The proposed algorithms incorporate the familiar steps of standard smooth message passing algorithms, which can be viewed as coordinate minimization steps. We show that these accelerated variants achieve faster rates for finding $ε$-optimal points of the unregularized problem, and, when the LP is tight, we prove that the proposed algorithms recover the true MAP solution in fewer iterations than standard message passing algorithms.
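For context, the entropy-regularized LP relaxation that the abstract refers to typically takes the following standard form; the specific notation here ($\mu$ for pseudomarginals, $\mathcal{M}_L$ for the local polytope, $\theta$ for the potentials, $\delta$ for the regularization parameter, $H$ for entropy) is a common convention and an assumption on our part, not taken from this abstract:

```latex
% Standard entropy-regularized MAP LP relaxation (notation assumed):
% the linear MAP objective over the local polytope \mathcal{M}_L,
% smoothed by an entropy term H(\mu) weighted by 1/\delta.
\max_{\mu \in \mathcal{M}_L} \; \langle \theta, \mu \rangle + \frac{1}{\delta} H(\mu)
```

As $\delta \to \infty$ the smoothed problem approaches the unregularized LP relaxation; the smoothing is what yields the well-behaved dual on which coordinate-minimization (message passing) steps, and their accelerated variants, operate.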