Paper Title

Distributed Differential Privacy in Multi-Armed Bandits

Authors

Sayak Ray Chowdhury, Xingyu Zhou

Abstract

We consider the standard $K$-armed bandit problem under a distributed trust model of differential privacy (DP), which makes it possible to guarantee privacy without a trustworthy server. Under this trust model, previous work has largely focused on achieving privacy via a shuffle protocol, in which a batch of users' data is randomly permuted before being sent to a central server. This protocol achieves an $(\varepsilon,\delta)$ or approximate DP guarantee at the price of an additional additive $O\left(\frac{K\log T\sqrt{\log(1/\delta)}}{\varepsilon}\right)$ cost in $T$-step cumulative regret. In contrast, the optimal privacy cost for achieving a stronger $(\varepsilon,0)$ or pure DP guarantee under the widely used central trust model is only $\Theta\left(\frac{K\log T}{\varepsilon}\right)$, but there a trusted server is required. In this work, we aim to obtain a pure DP guarantee under the distributed trust model while sacrificing no more regret than under the central trust model. We achieve this by designing a generic bandit algorithm based on successive arm elimination, where privacy is guaranteed by corrupting rewards with equivalent discrete Laplace noise, ensured by a secure computation protocol. We also show that our algorithm, when instantiated with Skellam noise and the secure protocol, ensures \emph{Rényi differential privacy} -- a stronger notion than approximate DP -- under the distributed trust model, with a privacy cost of $O\left(\frac{K\sqrt{\log T}}{\varepsilon}\right)$.
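As a rough illustration of the mechanism the abstract describes, below is a minimal Python sketch pairing a batched successive-arm-elimination loop with integer-valued noise. It is a simplification under stated assumptions, not the paper's algorithm: rewards are Bernoulli, the noise is added directly in place of the secure computation protocol's output (the protocol would produce an identically distributed aggregate without any party seeing raw rewards), the confidence-radius constants are illustrative, the Skellam calibration is hypothetical, and all function names (`discrete_laplace`, `skellam`, `private_successive_elimination`) are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

def discrete_laplace(scale):
    """Discrete Laplace noise with P(X = k) proportional to exp(-|k| / scale),
    sampled as the difference of two i.i.d. geometric variables."""
    p = 1.0 - np.exp(-1.0 / scale)
    return (rng.geometric(p) - 1) - (rng.geometric(p) - 1)

def skellam(scale):
    """Skellam noise: difference of two i.i.d. Poisson variables. The mapping
    from `scale` to the Poisson mean is an illustrative assumption only."""
    mu = scale ** 2
    return rng.poisson(mu) - rng.poisson(mu)

def private_successive_elimination(means, T, eps, noise=discrete_laplace):
    """Batched successive arm elimination: in each phase, only a noised
    per-arm reward sum is released. Adding noise locally here stands in
    for the secure computation protocol's aggregate output."""
    K = len(means)
    active = list(range(K))
    totals = np.zeros(K)   # noisy cumulative reward sums
    counts = np.zeros(K)   # pull counts
    t, phase = 0, 1
    while t < T and len(active) > 1:
        pulls_per_arm = 2 ** phase              # doubling phase lengths
        for a in active:
            pulls = min(pulls_per_arm, T - t)
            if pulls == 0:
                break
            rewards = rng.binomial(1, means[a], pulls)  # Bernoulli rewards in {0, 1}
            # The integer-valued sum has sensitivity 1, so discrete Laplace
            # with scale 1/eps yields an eps-DP release of this phase's sum.
            totals[a] += rewards.sum() + noise(1.0 / eps)
            counts[a] += pulls
            t += pulls
        idx = np.array(active)
        est = totals[idx] / counts[idx]
        # Confidence radius: a sampling term plus a privacy-noise term;
        # the constants are illustrative, not the paper's exact values.
        rad = np.sqrt(2.0 * np.log(T) / counts[idx]) + np.log(T) / (eps * counts[idx])
        keep = est + rad >= np.max(est - rad)   # drop clearly suboptimal arms
        active = [a for a, k in zip(active, keep) if k]
        phase += 1
    return active

# The surviving arm should be the one with the highest mean (index 2).
print(private_successive_elimination([0.2, 0.5, 0.8], T=50_000, eps=1.0))
```

One design point worth noting: because phase lengths double, each arm releases only $O(\log T)$ noisy sums over the whole horizon, which is, roughly speaking, why the additive privacy cost in the regret bounds above scales with $\log T$ (or $\sqrt{\log T}$) rather than with $T$.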
