Paper Title

Variational Inference with Locally Enhanced Bounds for Hierarchical Models

Authors

Tomas Geffner, Justin Domke

Abstract

Hierarchical models represent a challenging setting for inference algorithms. MCMC methods struggle to scale to large models with many local variables and observations, and variational inference (VI) may fail to provide accurate approximations due to the use of simple variational families. Some variational methods (e.g. importance weighted VI) integrate Monte Carlo methods to give better accuracy, but these tend to be unsuitable for hierarchical models, as they do not allow for subsampling and their performance tends to degrade for high dimensional models. We propose a new family of variational bounds for hierarchical models, based on the application of tightening methods (e.g. importance weighting) separately for each group of local random variables. We show that our approach naturally allows the use of subsampling to get unbiased gradients, and that it fully leverages the power of methods that build tighter lower bounds by applying them independently in lower dimensional spaces, leading to better results and more accurate posterior approximations than relevant baselines.
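To make the idea in the abstract concrete, here is a minimal NumPy sketch of an importance-weighted bound applied separately to each group of local variables, with subsampling of groups. The model (a Gaussian global mean with conditionally Gaussian locals and observations) and the variational proposals are toy assumptions for illustration, not the paper's experiments; the scaling of the minibatch term by `N / |batch|` is what makes the subsampled estimate unbiased.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy hierarchical model (an assumption for illustration):
#   global mean  mu  ~ N(0, 1)
#   local        x_i ~ N(mu, 1)      for each group i
#   observation  y_i ~ N(x_i, 1)
N, K = 100, 8                 # N groups, K importance samples per group
y = rng.normal(0.0, 1.5, size=N)

def log_normal(x, m, s):
    """Log density of N(m, s^2) evaluated at x (vectorized)."""
    return -0.5 * np.log(2 * np.pi * s**2) - 0.5 * ((x - m) / s) ** 2

# Hypothetical variational proposals: q(mu) = N(0, 1), q(x_i) = N(y_i / 2, 1).
mu = rng.normal()             # one sample of the global variable from q(mu)

def per_group_bound(i):
    """Importance-weighted bound for group i, conditioned on the global mu.

    Tightening is applied in the low-dimensional space of x_i alone,
    so each group gets its own K-sample log-mean-exp term.
    """
    x = rng.normal(y[i] / 2, 1.0, size=K)       # K samples from q(x_i)
    logw = (log_normal(x, mu, 1.0)              # prior p(x_i | mu)
            + log_normal(y[i], x, 1.0)          # likelihood p(y_i | x_i)
            - log_normal(x, y[i] / 2, 1.0))     # proposal q(x_i)
    return np.logaddexp.reduce(logw) - np.log(K)  # tightens as K grows

# Subsample a minibatch of groups; rescaling keeps the estimate unbiased.
batch = rng.choice(N, size=10, replace=False)
local_term = (N / len(batch)) * sum(per_group_bound(i) for i in batch)

# Global term: log p(mu) - log q(mu) (they happen to cancel here, since
# the proposal q(mu) equals the prior in this toy setup).
bound = log_normal(mu, 0.0, 1.0) - log_normal(mu, 0.0, 1.0) + local_term
```

Because each log-mean-exp term involves only one group's local variable, the bound tightens per group as K grows, and only the minibatch of groups needs to be touched per gradient step, which is the subsampling property the abstract highlights.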
