Paper Title

EM's Convergence in Gaussian Latent Tree Models

Paper Authors

Yuval Dagan, Constantinos Daskalakis, Anthimos Vardis Kandiros

Paper Abstract

We study the optimization landscape of the log-likelihood function and the convergence of the Expectation-Maximization (EM) algorithm in latent Gaussian tree models, i.e., tree-structured Gaussian graphical models whose leaf nodes are observable and non-leaf nodes are unobservable. We show that the unique non-trivial stationary point of the population log-likelihood is its global maximum, and establish that the expectation-maximization algorithm is guaranteed to converge to it in the single latent variable case. Our results for the landscape of the log-likelihood function in general latent tree models provide support for the extensive practical use of maximum-likelihood-based methods in this setting. Our results for the EM algorithm extend an emerging line of work on obtaining global convergence guarantees for this celebrated algorithm. We show our results for the non-trivial stationary points of the log-likelihood by arguing that a certain system of polynomial equations obtained from the EM updates has a unique non-trivial solution. The global convergence of the EM algorithm follows by arguing that all trivial fixed points are higher-order saddle points.
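As a concrete illustration of the single-latent-variable setting the abstract refers to, the following is a minimal sketch (not the paper's analysis) of EM on a star-shaped Gaussian latent tree: one unobservable root z ~ N(0, 1) connected to d observable leaves x_i = λ_i z + ε_i with independent Gaussian leaf noise. This is equivalent to a one-factor analyzer, for which the E- and M-steps are available in closed form; the toy dimensions, parameter values, and variable names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy instance: one latent root z ~ N(0,1), d observable
# leaves x = lam_true * z + noise -- a star-shaped Gaussian latent tree.
d, n = 4, 20000
lam_true = np.array([1.0, -0.5, 0.8, 1.5])   # edge weights root -> leaves
psi_true = np.array([0.3, 0.2, 0.5, 0.4])    # leaf noise variances
z = rng.standard_normal(n)
X = z[:, None] * lam_true + rng.standard_normal((n, d)) * np.sqrt(psi_true)

# EM for the single-latent-variable model (a one-factor analyzer).
lam = np.ones(d)           # current edge-weight estimates
psi = np.ones(d)           # current leaf noise variances
S = X.T @ X / n            # sample covariance (zero-mean model)

for _ in range(500):
    # E-step: posterior moments of z given each observation x_n,
    # under the current model covariance Sigma = lam lam^T + diag(psi).
    sigma = np.outer(lam, lam) + np.diag(psi)
    beta = np.linalg.solve(sigma, lam)       # beta = Sigma^{-1} lam
    ez = X @ beta                            # E[z | x_n]
    ez2 = 1.0 - beta @ lam + ez**2           # E[z^2 | x_n]
    # M-step: closed-form maximizers of the expected complete log-likelihood.
    lam = (X.T @ ez) / ez2.sum()
    psi = np.diag(S) - lam * (X.T @ ez) / n

print(np.round(lam, 2))   # recovered edge weights (up to a global sign flip)
print(np.round(psi, 2))   # recovered leaf noise variances
```

Note the model is identifiable only up to a simultaneous sign flip of all edge weights (z and -z induce the same leaf distribution), which is why any comparison to the ground truth must be made up to sign; this is the kind of trivial symmetry that makes "non-trivial" stationary points the interesting object in the abstract.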
