Paper Title

Differentially Private Online-to-Batch for Smooth Losses

Paper Authors

Qinzi Zhang, Hoang Tran, Ashok Cutkosky

Paper Abstract

We develop a new reduction that converts any online convex optimization algorithm suffering $O(\sqrt{T})$ regret into an $\varepsilon$-differentially private stochastic convex optimization algorithm with the optimal convergence rate $\tilde O(1/\sqrt{T} + \sqrt{d}/\varepsilon T)$ on smooth losses in linear time, forming a direct analogy to the classical non-private "online-to-batch" conversion. By applying our techniques to more advanced adaptive online algorithms, we produce adaptive differentially private counterparts whose convergence rates depend on a priori unknown variances or parameter norms.
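
Since the abstract refers to the classical non-private "online-to-batch" conversion, a minimal sketch of that baseline may help: run an online learner on fresh stochastic gradients and return the average iterate, so that the average's excess risk is bounded by $\mathrm{Regret}_T / T = O(1/\sqrt{T})$ for a $\sqrt{T}$-regret learner. The sketch below uses plain online gradient descent as the base learner on a toy least-squares problem; the function name `ogd_online_to_batch`, the step size, and the toy data are illustrative assumptions, and the paper's private mechanism (which adds the $\sqrt{d}/\varepsilon T$ term) is not reproduced here.

```python
import numpy as np

def ogd_online_to_batch(grad_fn, data, x0, lr):
    """Classical (non-private) online-to-batch conversion sketch.

    At round t the online learner plays x_t, receives the stochastic
    gradient g_t = grad_fn(x_t, z_t) on a fresh sample z_t, and updates.
    Returning the average iterate converts the regret bound into a
    stochastic convergence guarantee via Jensen's inequality.
    (Names and step-size choice are illustrative, not from the paper.)
    """
    x = x0.copy()
    avg = np.zeros_like(x0)
    for t, z in enumerate(data, start=1):
        avg += (x - avg) / t      # running mean of the iterates x_1..x_t
        g = grad_fn(x, z)         # stochastic gradient at the played point
        x = x - lr * g            # OGD update: the base OCO learner
    return avg

# Toy usage: least-squares loss f(x; (a, b)) = 0.5 * (a @ x - b)^2
rng = np.random.default_rng(0)
d, T = 5, 10_000
x_star = rng.normal(size=d)
samples = [(a := rng.normal(size=d), a @ x_star + 0.1 * rng.normal())
           for _ in range(T)]
grad = lambda x, z: (z[0] @ x - z[1]) * z[0]
x_bar = ogd_online_to_batch(grad, samples, np.zeros(d), lr=1.0 / np.sqrt(T))
print(np.linalg.norm(x_bar - x_star))  # distance to the true parameter
```

Averaging the iterates, rather than returning the last one, is what turns the regret guarantee into an excess-risk bound; the paper's reduction privatizes this pipeline while preserving its linear-time, single-pass structure.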
