Paper ID | L.10.2
Paper Title | Robust Generalization via f-Mutual Information
Authors | Amedeo Roberto Esposito, Michael Gastpar, EPFL, Switzerland; Ibrahim Issa, American University of Beirut, Lebanon
Session | L.10: Learning Theory I
Presentation | Lecture
Track | Statistics and Learning Theory
Abstract | Given two probability measures $P$ and $Q$ and an event $E$, we provide bounds on $P(E)$ in terms of $Q(E)$ and $f$-divergences. In particular, the bounds are instantiated when the measures considered are a joint distribution and the corresponding product of marginals. This allows us to control the measure of an event under the joint, using the product of the marginals (typically easier to compute) and a measure of how much the two distributions differ, \textit{i.e.,} an $f$-divergence between the joint and the product of the marginals, also known in the literature as $f$-Mutual Information. The result is general enough to induce, as special cases, bounds involving the $\chi^2$-divergence, Hellinger distance, Total Variation, etc. Moreover, it also recovers a result involving R\'enyi's $\alpha$-divergence. As an application, we provide bounds on the generalization error of learning algorithms via $f$-divergences.
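As a sketch of how bounds of this type arise, the LaTeX block below works out the standard change-of-measure and H\"older argument that yields a R\'enyi-type bound on $P(E)$ in terms of $Q(E)$. It is a generic illustration under the assumptions $P \ll Q$ and $\alpha > 1$, not necessarily the exact statement proved in the manuscript.

% Illustrative sketch only: a generic change-of-measure bound of the type the
% abstract describes, assuming P << Q and alpha > 1; the manuscript's exact
% statements may differ.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
Write $P(E)=\mathbb{E}_Q\bigl[\mathbf{1}_E\,\tfrac{dP}{dQ}\bigr]$ and apply
H\"older's inequality with exponents $\alpha$ and $\tfrac{\alpha}{\alpha-1}$:
\begin{align*}
P(E) &\le \left(\mathbb{E}_Q\Bigl[\Bigl(\tfrac{dP}{dQ}\Bigr)^{\alpha}\Bigr]\right)^{1/\alpha}
          Q(E)^{\frac{\alpha-1}{\alpha}}
      = \exp\Bigl(\tfrac{\alpha-1}{\alpha}\,D_\alpha(P\|Q)\Bigr)\,
        Q(E)^{\frac{\alpha-1}{\alpha}},
\end{align*}
using $D_\alpha(P\|Q)=\tfrac{1}{\alpha-1}\log\mathbb{E}_Q\bigl[(\tfrac{dP}{dQ})^{\alpha}\bigr]$.
Choosing $P=P_{XY}$ and $Q=P_X P_Y$ turns $D_\alpha(P\|Q)$ into an
$\alpha$-mutual information, bounding the joint measure of $E$ by the
(typically easier to compute) product-of-marginals measure.
\end{document}

Roughly, in the generalization application $E$ is the event that the generalization error exceeds a threshold; its measure under the product of marginals is controlled by standard concentration arguments, and the divergence term accounts for the dependence introduced by the learning algorithm.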