Technical Program

Paper Detail

Paper ID L.4.3
Paper Title On Learning Parametric Non-Smooth Continuous Distributions
Authors Sudeep Kamath, PDT Partners, United States; Alon Orlitsky, University of California San Diego, United States; Venkatadheeraj Pichapati, Apple Inc., United States; Ehsan Zobeidi, University of California San Diego, United States
Session L.4: Distribution Learning
Presentation Lecture
Track Statistics and Learning Theory
Abstract With the eventual goal of better understanding learning rates of general continuous distributions, we derive the first essentially min-max optimal estimators and learning rates for several natural classes of parametric non-smooth continuous distributions under KL divergence. In particular, we show that unlike the folk theorem of a 1/(2n) learning-rate increase per distribution parameter, non-smooth distributions exhibit a wide range of learning rates.
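The folk-theorem baseline mentioned in the abstract can be illustrated with a smooth one-parameter family: for a Gaussian with unknown mean and unit variance, the expected KL divergence of the sample-mean estimator is exactly 1/(2n). The sketch below (an illustrative simulation, not code from the paper; the function names are hypothetical) checks this numerically:

```python
import random

def kl_gauss_mean(mu, mu_hat):
    # Closed form: KL(N(mu, 1) || N(mu_hat, 1)) = (mu - mu_hat)^2 / 2
    return (mu - mu_hat) ** 2 / 2

def empirical_kl_risk(n, trials=5000, mu=0.0, seed=0):
    # Average KL loss of the sample-mean estimator over repeated experiments,
    # each drawing n observations from N(mu, 1).
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        sample_mean = sum(rng.gauss(mu, 1.0) for _ in range(n)) / n
        total += kl_gauss_mean(mu, sample_mean)
    return total / trials

if __name__ == "__main__":
    n = 100
    risk = empirical_kl_risk(n)
    # Should be close to the folk-theorem rate 1/(2n) = 0.005 for n = 100.
    print(f"empirical KL risk: {risk:.5f}, 1/(2n): {1 / (2 * n):.5f}")
```

For non-smooth families (e.g., densities with support depending on the parameter), this 1/(2n)-per-parameter behavior no longer holds, which is the phenomenon the paper characterizes.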



2021 IEEE International Symposium on Information Theory

11-16 July 2021 | Melbourne, Victoria, Australia
