Paper Title


Conditional Deep Gaussian Processes: multi-fidelity kernel learning

Paper Authors

Chi-Ken Lu, Patrick Shafto

Paper Abstract


Deep Gaussian Processes (DGPs) were proposed as an expressive Bayesian model capable of a mathematically grounded estimation of uncertainty. The expressivity of DGPs results not only from their compositional character but also from the distribution propagation within the hierarchy. Recently, [1] pointed out that the hierarchical structure of DGPs is well suited to modeling multi-fidelity regression, in which one is provided with sparse high-precision observations and plenty of low-fidelity observations. We propose the conditional DGP model, in which the latent GPs are directly supported by the fixed lower-fidelity data. The moment-matching method of [2] is then applied to approximate the marginal prior of the conditional DGP with a GP. The obtained effective kernels are implicit functions of the lower-fidelity data, manifesting the expressivity contributed by distribution propagation within the hierarchy. The hyperparameters are learned by optimizing the approximate marginal likelihood. Experiments with synthetic and high-dimensional data show performance comparable to other multi-fidelity regression methods, variational inference, and multi-output GPs. We conclude that, with the low-fidelity data and the hierarchical DGP structure, the effective kernel encodes the inductive bias for the true function, allowing the compositional freedom discussed in [3,4].
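
To make the pipeline described in the abstract concrete, below is a minimal sketch, not the paper's implementation, of two-fidelity regression with an effective kernel: a first GP is conditioned on the fixed low-fidelity data, the second layer's kernel is averaged over that GP's predictive distribution, and standard GP regression is then run on the sparse high-fidelity targets. The RBF kernels, fixed hyperparameters, toy benchmark functions, and Monte Carlo averaging (in place of the paper's closed-form moment matching and marginal-likelihood learning of the hyperparameters) are illustrative assumptions, as are the helper names gp_posterior and effective_kernel.

import numpy as np


def rbf(a, b, lengthscale, variance=1.0):
    # Squared-exponential kernel between column vectors a (n, 1) and b (m, 1).
    return variance * np.exp(-0.5 * (a - b.T) ** 2 / lengthscale ** 2)


def gp_posterior(X, y, Xs, lengthscale=0.1, variance=1.0, noise=1e-4):
    # Exact GP posterior mean and covariance at the test inputs Xs.
    K = rbf(X, X, lengthscale, variance) + noise * np.eye(len(X))
    Ks = rbf(X, Xs, lengthscale, variance)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    V = np.linalg.solve(L, Ks)
    return Ks.T @ alpha, rbf(Xs, Xs, lengthscale, variance) - V.T @ V


def effective_kernel(mean, cov, lengthscale=1.0, n_samples=500, seed=0):
    # Moment-matched second-layer kernel E_g[k(g(x), g(x'))], estimated here
    # by Monte Carlo over samples from the first layer's posterior.
    rng = np.random.default_rng(seed)
    g = rng.multivariate_normal(mean.ravel(), cov + 1e-6 * np.eye(len(mean)),
                                size=n_samples)
    K_eff = sum(rbf(gs[:, None], gs[:, None], lengthscale) for gs in g)
    return K_eff / n_samples


# Toy two-fidelity pair: a cheap, biased model and an expensive "truth".
f_low = lambda x: np.sin(8 * np.pi * x)
f_high = lambda x: (x - np.sqrt(2)) * f_low(x) ** 2

X_low = np.linspace(0, 1, 50)[:, None]    # plentiful low-fidelity inputs
X_high = np.linspace(0, 1, 8)[:, None]    # sparse high-fidelity inputs
X_test = np.linspace(0, 1, 100)[:, None]

# First layer: GP conditioned on the fixed low-fidelity observations.
X_all = np.vstack([X_high, X_test])
mu, Sigma = gp_posterior(X_low, f_low(X_low), X_all)

# Second layer: GP regression on the high-fidelity targets with the
# effective kernel, which depends on the low-fidelity data only through
# the first layer's posterior (mu, Sigma).
K_eff = effective_kernel(mu, Sigma)
n = len(X_high)
K_tr = K_eff[:n, :n] + 1e-4 * np.eye(n)
pred_mean = K_eff[:n, n:].T @ np.linalg.solve(K_tr, f_high(X_high))

print("test RMSE:", float(np.sqrt(np.mean((pred_mean - f_high(X_test)) ** 2))))

The point the sketch tries to reflect is that K_eff is an implicit function of the lower-fidelity data: it enters only through the first layer's posterior (mu, Sigma), mirroring how the conditional DGP's effective kernel carries the low-fidelity information into the high-fidelity regression.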
