Paper Detail

Paper ID S.9.4
Paper Title Approximate Gács-Körner Common Information
Authors Salman Salamatian, MIT, United States; Asaf Cohen, Ben Gurion University of the Negev, Israel; Muriel Médard, MIT, United States
Session S.9: Information Measures I
Presentation Lecture
Track Shannon Theory
Abstract We propose to exploit the structure of the correlation between two random variables X and Y via a relaxation of the Common Information problem of Gács and Körner (GK Common Information). Consider two correlated sources X and Y generated from a joint distribution P_{X,Y}. We consider embeddings of X into discrete random variables U such that H(U|Y) ≤ δ, while maximizing I(X;U). When δ = 0, this reduces to the GK Common Information problem. However, unlike the GK Common Information, which is known to be zero for many pairs of random variables (X, Y), we show that this relaxation captures the structure of the correlation between X and Y for a much broader range of joint distributions, and we showcase applications to some problems in multi-terminal information theory.
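For concreteness, the relaxation described in the abstract can be written as the following optimization; this is a restatement based only on the abstract, assuming the slack parameter is denoted δ and that, as the term "embedding" and the Gács–Körner setting suggest, U is a deterministic function of X:

\[
  \max_{U:\, U = f(X)} \; I(X;U)
  \qquad \text{subject to} \qquad H(U \mid Y) \le \delta ,
\]

which recovers the GK Common Information when \(\delta = 0\).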
