Paper Detail

Paper ID: S.5.5
Paper Title: Variable-Length Source Dispersions Differ under Maximum and Average Error Criteria
Authors: Yuta Sakai and Vincent Y. F. Tan (National University of Singapore, Singapore)
Session: S.5, Error Exponents
Presentation: Lecture
Track: Shannon Theory
Abstract: Variable-length compression without prefix-free constraints and with side-information available at both encoder and decoder is considered. Instead of requiring the code to be error-free, we allow for it to have a non-vanishing error probability. We derive one-shot bounds on the optimal average codeword length by proposing two new information quantities; namely, the conditional and unconditional $\varepsilon$-cutoff entropies. Using these one-shot bounds, we obtain the second-order asymptotics of the problem under two different formalisms---the average and maximum probabilities of error with respect to the side-information. While the first-order terms in the asymptotic expansions for both formalisms are identical, we find that the source dispersion under the average error formalism is, in most cases, strictly smaller than its maximum counterpart. Applications to a certain class of guessing problems, previously studied by Kuzuoka [\emph{{IEEE} Trans.\ Inf. Theory}, vol.~66, no.~3, pp.~1674--1690, 2020], are also discussed.
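For context, second-order expansions in this line of work typically take the following shape; the version below is the known result for variable-length compression allowing errors without side information (Kostina, Polyanskiy, and Verdú, IEEE Trans. Inf. Theory, 2015), shown only as an illustrative sketch of what "first-order term" and "dispersion term" mean here, not as the statement proved in this paper:

```latex
% Optimal average codeword length for compressing X^n (i.i.d. source X)
% with error probability at most \varepsilon, no side information:
%   first-order term: (1-\varepsilon) n H(X)
%   dispersion term:  of order \sqrt{n}, governed by the varentropy V(X)
\mathbb{E}\bigl[\ell^*(X^n,\varepsilon)\bigr]
  = (1-\varepsilon)\, n H(X)
  - \sqrt{\frac{n V(X)}{2\pi}}\; e^{-\frac{\bigl(Q^{-1}(\varepsilon)\bigr)^2}{2}}
  + O(\log n),
```

where $Q^{-1}$ is the inverse of the Gaussian complementary CDF and $V(X)$ is the varentropy of the source. The present paper's contribution, per the abstract, is the analogue of the $\sqrt{n}$-order coefficient with side information, which turns out to differ between the average- and maximum-error formalisms.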

2021 IEEE International Symposium on Information Theory

11-16 July 2021 | Melbourne, Victoria, Australia
