Paper Title

On the Rényi Cross-Entropy

Paper Authors

Ferenc Cole Thierrin, Fady Alajaji, Tamás Linder

Paper Abstract

The Rényi cross-entropy measure between two distributions, a generalization of the Shannon cross-entropy, was recently used as a loss function for the improved design of deep learning generative adversarial networks. In this work, we examine the properties of this measure and derive closed-form expressions for it when one of the distributions is fixed and when both distributions belong to the exponential family. We also analytically determine a formula for the cross-entropy rate for stationary Gaussian processes and for finite-alphabet Markov sources.
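
As a concrete illustration of the measure described in the abstract, below is a minimal numerical sketch (not taken from the paper) of the Rényi cross-entropy for discrete distributions. It assumes the form H_α(p; q) = (1/(1−α)) log Σ_x p(x) q(x)^(α−1), which recovers the Shannon cross-entropy −Σ_x p(x) log q(x) as α → 1; the helper name `renyi_cross_entropy` is hypothetical.

```python
import numpy as np

def renyi_cross_entropy(p, q, alpha):
    """Rényi cross-entropy of order alpha between discrete distributions p and q.

    Assumes H_alpha(p; q) = (1 / (1 - alpha)) * log(sum_x p(x) * q(x)**(alpha - 1)),
    which reduces to the Shannon cross-entropy -sum_x p(x) * log(q(x)) as alpha -> 1.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if np.isclose(alpha, 1.0):
        # Shannon cross-entropy limit
        return -np.sum(p * np.log(q))
    return np.log(np.sum(p * q ** (alpha - 1.0))) / (1.0 - alpha)

# Example: two distributions on a 3-letter alphabet
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(renyi_cross_entropy(p, q, alpha=0.5))
print(renyi_cross_entropy(p, q, alpha=1.0))  # Shannon cross-entropy
print(renyi_cross_entropy(p, q, alpha=2.0))
```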
