Paper Title
Score-based Generative Modeling Secretly Minimizes the Wasserstein Distance
Paper Authors
Paper Abstract
Score-based generative models have been shown to achieve remarkable empirical performance in various applications such as image generation and audio synthesis. However, the theoretical understanding of score-based diffusion models is still incomplete. Recently, Song et al. showed that the training objective of score-based generative models is equivalent to minimizing the Kullback-Leibler divergence of the generated distribution from the data distribution. In this work, we show that score-based models also minimize the Wasserstein distance between the two distributions under suitable assumptions on the model. Specifically, we prove that the Wasserstein distance is upper bounded by the square root of the objective function up to multiplicative constants and a fixed constant offset. Our proof is based on a novel application of the theory of optimal transport, which may be of independent interest to the community. Our numerical experiments support our findings. By analyzing our upper bounds, we provide a few techniques to obtain tighter upper bounds.
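In schematic form, the bound described in the abstract can be written as follows. This is a sketch of the claim's shape only; the notation here is assumed, not the paper's exact statement: $W_2$ denotes the Wasserstein-2 distance, $J(\theta)$ the score-matching training objective, and $C_1$, $C_2$ the multiplicative constant and fixed offset mentioned above.

\[
  % Assumed notation: p_{data} is the data distribution and p_\theta is
  % the distribution generated by the trained score-based model; C_1 and
  % C_2 do not depend on the learned score network's parameters \theta.
  W_2\bigl(p_{\mathrm{data}},\, p_\theta\bigr) \;\le\; C_1 \sqrt{J(\theta)} \,+\, C_2
\]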