Approximate Gács-Körner Common Information
Salman Salamatian, MIT, United States; Asaf Cohen, Ben Gurion University of the Negev, Israel; Muriel Médard, MIT, United States
S.9: Information Measures I
We propose to exploit the structure of the correlation between two random variables X and Y via a relaxation of the Common Information problem of Gács and Körner (GK Common Information). Consider two correlated sources X and Y generated from a joint distribution P_{X,Y}. We consider embeddings of X into discrete random variables U such that H(U|Y) < δ, while maximizing I(X;U). When δ = 0, this reduces to the GK Common Information problem. However, unlike the GK Common Information, which is known to be zero for many pairs of random variables (X, Y), we show that this relaxation captures the structure in the correlation between X and Y for a much broader range of joint distributions, and we showcase applications to some problems in multi-terminal information theory.
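To make the quantities in the abstract concrete, the following sketch evaluates the objective I(X;U) and the constraint H(U|Y) for one candidate deterministic embedding U = f(X) on a toy joint distribution. The distribution and the embedding are illustrative choices, not from the paper: P_{X,Y} is an almost-block-diagonal matrix, so the exact GK Common Information is zero, yet a coarse embedding of X achieves one full bit of I(X;U) at a small H(U|Y).

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Toy joint distribution P_{X,Y} over X, Y in {0,1,2,3} (illustrative choice).
# It is nearly block-diagonal in the blocks {0,1} and {2,3}, but the small
# off-block mass (0.01) makes the exact GK Common Information zero.
P = np.array([
    [0.19, 0.05, 0.01, 0.00],
    [0.05, 0.19, 0.00, 0.01],
    [0.01, 0.00, 0.19, 0.05],
    [0.00, 0.01, 0.05, 0.19],
])

# Candidate deterministic embedding U = f(X): which block X falls in.
f = np.array([0, 0, 1, 1])

# Joint distribution of (U, Y): sum the rows of P that map to each u.
P_UY = np.zeros((2, 4))
for x in range(4):
    P_UY[f[x]] += P[x]

P_U = P_UY.sum(axis=1)
P_Y = P_UY.sum(axis=0)

# For a deterministic embedding, H(U|X) = 0, so I(X;U) = H(U).
I_XU = entropy(P_U)

# Constraint value: H(U|Y) = H(U,Y) - H(Y).
H_U_given_Y = entropy(P_UY) - entropy(P_Y)

print(f"I(X;U)  = {I_XU:.4f} bits")
print(f"H(U|Y) = {H_U_given_Y:.4f} bits")
```

Here I(X;U) = 1 bit while H(U|Y) ≈ 0.24 bits, so for any δ above that value the relaxed problem recovers the near-common block structure that the strict δ = 0 formulation misses.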