Analysis of K Nearest Neighbor KL Divergence Estimation for Continuous Distributions
Puning Zhao, Lifeng Lai, University of California Davis, United States
L.4: Distribution Learning
Statistics and Learning Theory
Estimating Kullback-Leibler divergence from independent and identically distributed (i.i.d.) samples is an important problem in various domains. One simple and effective estimator is based on the k nearest neighbor distances between these samples. In this paper, we analyze the convergence rates of the bias and variance of this estimator.
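The abstract does not specify the estimator's exact form; a minimal sketch of a standard kNN-based KL divergence estimator in this family (in the style of Wang, Kulkarni, and Verdú) is given below, assuming samples X from p and Y from q, with ρ the k-th nearest neighbor distance within X and ν the k-th nearest neighbor distance from X into Y. The function name `knn_kl_divergence` is illustrative, not from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=5):
    """kNN-distance estimate of D(p || q).

    A sketch of the standard estimator, not the authors' implementation.
    x : (n, d) array of i.i.d. samples from p.
    y : (m, d) array of i.i.d. samples from q.
    """
    x = np.atleast_2d(x)
    y = np.atleast_2d(y)
    n, d = x.shape
    m = y.shape[0]

    # rho_i: distance from x_i to its k-th nearest neighbor within x,
    # excluding x_i itself (hence querying k + 1 neighbors).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]

    # nu_i: distance from x_i to its k-th nearest neighbor in y.
    d_nu, _ = cKDTree(y).query(x, k=k)
    nu = d_nu if k == 1 else d_nu[:, -1]

    # Standard bias-corrected combination of the two distance ratios.
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))
```

For two 1-D Gaussians N(0, 1) and N(1, 1), the true divergence is 0.5 nats, and with a few thousand samples the estimate lands close to that value; the paper's contribution is characterizing how fast the bias and variance of such estimates vanish.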