Paper Title

Lower Bounds for Rényi Differential Privacy in a Black-Box Setting

Authors

Tim Kutta, Önder Askin, Martin Dunsche

Abstract

We present new methods for assessing the privacy guarantees of an algorithm with regard to Rényi Differential Privacy. To the best of our knowledge, this work is the first to address this problem in a black-box scenario, where only algorithmic outputs are available. To quantify privacy leakage, we devise a new estimator for the Rényi divergence of a pair of output distributions. This estimator is transformed into a statistical lower bound that is proven to hold for large samples with high probability. Our method is applicable for a broad class of algorithms, including many well-known examples from the privacy literature. We demonstrate the effectiveness of our approach by experiments encompassing algorithms and privacy enhancing methods that have not been considered in related works.
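The abstract does not specify the paper's estimator, so the following is only a minimal illustrative sketch of the general idea: estimate the Rényi divergence between a pair of output distributions purely from samples (the black-box setting). It uses a simple plug-in histogram estimator on outputs of a Gaussian mechanism run on two hypothetical adjacent datasets, and compares against the known closed-form divergence for two equal-variance Gaussians.

```python
# Hypothetical sketch -- NOT the estimator from the paper, which the
# abstract does not describe in detail.
import numpy as np

def renyi_divergence_plugin(x, y, alpha=2.0, bins=50):
    """Plug-in estimate of D_alpha(P || Q) from samples x ~ P, y ~ Q,
    using shared histogram bins as a crude density estimate."""
    lo, hi = min(x.min(), y.min()), max(x.max(), y.max())
    edges = np.linspace(lo, hi, bins + 1)
    p, _ = np.histogram(x, bins=edges)
    q, _ = np.histogram(y, bins=edges)
    p = p / p.sum()
    q = q / q.sum()
    mask = (p > 0) & (q > 0)  # skip bins where either estimate is empty
    s = np.sum(p[mask] ** alpha * q[mask] ** (1.0 - alpha))
    return np.log(s) / (alpha - 1.0)

rng = np.random.default_rng(0)
sigma = 1.0
# Black-box outputs of a Gaussian mechanism on two adjacent datasets
# whose true answers differ by 1 (sensitivity 1).
x = rng.normal(0.0, sigma, 200_000)
y = rng.normal(1.0, sigma, 200_000)

alpha = 2.0
est = renyi_divergence_plugin(x, y, alpha)
# Closed form for N(mu0, s^2) vs N(mu1, s^2): alpha*(mu0-mu1)^2 / (2 s^2)
true = alpha * 1.0 / (2 * sigma**2)
print(f"plug-in estimate: {est:.3f}, closed form: {true:.3f}")
```

Such a point estimate is only the first step described in the abstract; the paper's contribution is turning an estimate of this kind into a statistical lower bound that holds with high probability for large samples.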
