Paper ID | P.4.4
Paper Title | A Better Bound Gives a Hundred Rounds: Enhanced Privacy Guarantees via f-Divergences
Authors | Shahab Asoodeh, Harvard University, United States; Jiachun Liao, Arizona State University, United States; Flavio P. Calmon, Harvard University, United States; Oliver Kosut, Lalitha Sankar, Arizona State University, United States
Session | P.4: Information Privacy II
Presentation | Lecture
Track | Cryptography, Security and Privacy
Abstract | We derive the optimal differential privacy (DP) parameters of a mechanism that satisfies a given level of Rényi differential privacy (RDP). Our result is based on the joint range of two f-divergences that underlie the approximate and the Rényi variants of differential privacy. We apply our result to the moments accountant framework for characterizing the privacy guarantees of stochastic gradient descent. Compared to the state of the art, our bounds may allow about 100 additional stochastic gradient descent iterations for training deep learning models under the same privacy budget.
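
For context, the sketch below illustrates the standard (non-optimal) pipeline that the abstract improves upon: compose the Rényi-DP guarantee of the Gaussian mechanism across SGD iterations, then convert the total to (epsilon, delta)-DP with the usual bound epsilon = rho(alpha) + log(1/delta)/(alpha - 1) of Mironov (2017), optimized over the order alpha. The paper's contribution is a tighter, optimal RDP-to-DP conversion obtained from the joint range of the two underlying f-divergences; that tightening is what buys the roughly 100 extra iterations. All function and parameter names here are illustrative assumptions, not the authors' code, and subsampling amplification (central to the full moments accountant) is omitted for brevity.

import math

def gaussian_rdp(alpha, sigma):
    # Renyi divergence of order alpha between N(0, sigma^2) and N(1, sigma^2),
    # i.e. the RDP curve of the Gaussian mechanism with unit L2 sensitivity
    # and noise multiplier sigma.
    return alpha / (2.0 * sigma ** 2)

def rdp_to_dp(rho, alpha, delta):
    # Standard conversion (Mironov 2017): an (alpha, rho)-RDP mechanism
    # satisfies (rho + log(1/delta) / (alpha - 1), delta)-DP.
    # The paper derives the optimal replacement for this step.
    return rho + math.log(1.0 / delta) / (alpha - 1.0)

def sgd_epsilon(sigma, num_steps, delta, orders=None):
    # Additively compose the per-step RDP over num_steps iterations and
    # convert to (eps, delta)-DP, minimizing over the Renyi order alpha.
    if orders is None:
        orders = [1.5 + 0.5 * i for i in range(60)]  # alpha in {1.5, 2.0, ..., 31.0}
    return min(rdp_to_dp(num_steps * gaussian_rdp(a, sigma), a, delta)
               for a in orders)

if __name__ == "__main__":
    # Illustrative numbers only; without subsampling amplification, a large
    # noise multiplier is needed to reach a small epsilon.
    print("epsilon ~", round(sgd_epsilon(sigma=60.0, num_steps=1_000, delta=1e-5), 3))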