Paper Detail
Paper ID: L.4.1
Paper Title: Analysis of K Nearest Neighbor KL Divergence Estimation for Continuous Distributions
Authors: Puning Zhao, Lifeng Lai (University of California, Davis, United States)
Session: L.4: Distribution Learning
Presentation: Lecture
Track: Statistics and Learning Theory
Abstract: Estimating Kullback-Leibler divergence from independent and identically distributed samples is an important problem in various domains. One simple and effective estimator is based on the k nearest neighbor distances between these samples. In this paper, we analyze the convergence rates of the bias and variance of this estimator.
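As an illustration of the kind of estimator the abstract refers to, below is a minimal NumPy sketch of a standard kNN-distance-based KL divergence estimator (the Wang-Kulkarni-Verdu form): it compares each sample's k-th nearest neighbor distance within its own sample set against its k-th nearest neighbor distance in the other set. The exact variant analyzed in the paper may differ; the function name and its parameters here are illustrative.

```python
import numpy as np

def knn_kl_divergence(x, y, k=1):
    """Estimate D(p || q) from samples using k nearest neighbor distances.

    x : (n, d) array of i.i.d. samples from p
    y : (m, d) array of i.i.d. samples from q
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]

    # rho_k(i): distance from x_i to its k-th nearest neighbor in x \ {x_i}
    dxx = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    np.fill_diagonal(dxx, np.inf)          # exclude the point itself
    rho = np.sort(dxx, axis=1)[:, k - 1]

    # nu_k(i): distance from x_i to its k-th nearest neighbor in y
    dxy = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=-1)
    nu = np.sort(dxy, axis=1)[:, k - 1]

    # d * mean log-ratio of distances, plus a log(m/(n-1)) correction term
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))
```

The final `log(m / (n - 1))` term accounts for the different sample sizes used when searching within `x` (where the point itself is excluded) versus within `y`. The brute-force pairwise distance matrices above are O(n*m) in memory; for large samples one would typically substitute a k-d tree nearest-neighbor query.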

2021 IEEE International Symposium on Information Theory

11-16 July 2021 | Melbourne, Victoria, Australia