
Paper Detail

Paper ID: S.10.4
Paper Title: Complexity of Estimating Rényi Entropy of Markov Chains
Authors: Maciej Skorski, University of Luxembourg, Luxembourg; Maciej Obremski, National University of Singapore, Singapore
Session: S.10: Information Measures II
Presentation: Lecture
Track: Shannon Theory
Abstract: Estimating the entropy of random processes is one of the fundamental problems of machine learning and property testing. It has numerous applications, from DNA testing and predictability of human behaviour to modeling neural activity and cryptography. We investigate the problem of Rényi entropy estimation for sources that form Markov chains. Kamath and Verdú (ISIT'16) showed that good mixing properties are essential for this task. We prove that even with very good mixing time, estimating entropy of order $\alpha>1$ requires $\Omega(K^{2-1/\alpha})$ samples, where $K$ is the size of the alphabet; in particular, min-entropy requires $\Omega(K^2)$ samples and collision entropy requires $\Omega(K^{3/2})$ samples. Our results hold in both asymptotic and non-asymptotic regimes (under mild restrictions). The analysis is completed by an upper complexity bound of $O(K^2)$ for the standard plug-in estimator. This leaves an interesting open question of how to improve upon the plug-in estimator, which looks much more challenging than for IID sources (which tensorize nicely). We obtain the results by applying Le Cam's method to two Markov chains that differ by an appropriately chosen sparse perturbation; the discrepancy between these chains is estimated with the help of perturbation theory. Our techniques might be of independent interest.
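The plug-in estimator mentioned in the abstract can be sketched as follows: form the empirical state distribution from a sample path of the chain and substitute it into the Rényi entropy formula $H_\alpha(p) = \frac{1}{1-\alpha}\log\sum_i p_i^\alpha$. Below is a minimal illustrative sketch, not the paper's implementation; the lazy random walk used as the source is a hypothetical fast-mixing example with uniform stationary distribution, so the collision-entropy estimate should be close to $\log K$.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1), in nats."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # drop zero-probability states to avoid 0**alpha issues
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def plugin_estimate(sample, K, alpha):
    """Plug-in estimate: empirical state frequencies fed into the entropy formula.

    `sample` is a sequence of observed states from {0, ..., K-1}.
    """
    counts = np.bincount(sample, minlength=K)
    p_hat = counts / counts.sum()
    return renyi_entropy(p_hat, alpha)

# Hypothetical demo source: a lazy random walk on K states
# (stay put w.p. 1/2, else jump to a uniform state), which mixes fast
# and has the uniform stationary distribution.
rng = np.random.default_rng(0)
K, n = 8, 50_000
chain = np.empty(n, dtype=int)
chain[0] = 0
for t in range(1, n):
    chain[t] = chain[t - 1] if rng.random() < 0.5 else rng.integers(K)

H2 = plugin_estimate(chain, K, alpha=2.0)  # collision entropy (order 2)
```

For this uniform-stationary example, `H2` should be close to `log(8) ≈ 2.08`; the paper's lower bounds concern how many samples such an estimator needs in the worst case over chains, which this toy demo does not probe.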



2021 IEEE International Symposium on Information Theory

11-16 July 2021 | Melbourne, Victoria, Australia
