Paper Detail

Paper ID N.4.4
Paper Title Successive Refinement to Caching for Dynamic Requests
Authors Pınar Şen, University of California, San Diego, United States; Michael Gastpar, École Polytechnique Fédérale de Lausanne, Switzerland; Young-Han Kim, University of California, San Diego, United States
Session N.4: Coded Caching II
Presentation Lecture
Track Networking and Network Coding
Abstract In the celebrated coded caching problem studied by Maddah-Ali and Niesen, the peak-traffic network load is reduced by first caching some information about the contents in the individual memories of end users during off-peak hours and then, upon user requests, broadcasting additional information about the contents that, combined with the cached information, allows each user to recover their requested content. Information-theoretic studies of coded caching thus involve the optimal tradeoff between the communication rates of the two phases, cache placement and content delivery, as well as the optimal construction of codes for both phases. To allow better utilization of network resources, this paper introduces a new caching model in which user requests can arise at any point during the cache placement phase, and proposes a successive refinement approach to this "dynamic" caching problem. For uniformly random file requests, the optimal tradeoff among average-case delivery rates is characterized when the cache rate is above a well-defined threshold. For arbitrary file requests, a successive caching algorithm is developed that simultaneously reduces the worst-case delivery rates at every request time, each of which is within a constant multiplicative factor of its respective optimum.
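The dynamic model above extends the static Maddah-Ali–Niesen setup, in which the worst-case delivery rate at the corner points of the memory-rate tradeoff is R = K(1 - M/N)/(1 + KM/N) for K users, N files, and per-user cache size M files. As background, here is a minimal sketch of that rate formula (the function name `man_delivery_rate` is illustrative, not from the paper):

```python
def man_delivery_rate(K: int, N: int, M: float) -> float:
    """Worst-case delivery rate R(M) = K*(1 - M/N) / (1 + K*M/N)
    of the Maddah-Ali--Niesen scheme, valid at the corner points
    where t = K*M/N is an integer."""
    t = K * M / N
    return K * (1 - M / N) / (1 + t)

# Example: K = N = 4 and M = 2 (each user caches half the library).
# The uncoded load K*(1 - M/N) = 2 files drops to 2/3 of a file
# thanks to coded multicasting.
print(man_delivery_rate(4, 4, 2))  # -> 0.666...
```

The paper's contribution concerns how such placement can be refined successively when requests may arrive mid-placement; the formula above describes only the static baseline.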



2021 IEEE International Symposium on Information Theory

11-16 July 2021 | Melbourne, Victoria, Australia
