Learning Additive Noise Channels: Generalization Bounds and Algorithms
Nir Weinberger, Massachusetts Institute of Technology, United States
L.4: Distribution Learning
Statistics and Learning Theory
An additive noise channel is considered, in which the noise distribution is unknown and is not assumed to belong to any parametric family. The problem of designing a codebook and a generalized minimal-distance decoder (parameterized by a covariance matrix) based on samples of the noise is studied. High-probability generalization bounds are provided for the error-probability loss function, as well as for a hinge-type surrogate loss function. A stochastic-gradient-based alternating-minimization algorithm for the latter loss function is presented. Bounds on the average empirical error and generalization error are also provided for a Gibbs-based algorithm that gradually expurgates codewords from a large initial codebook to obtain a smaller codebook with improved error probability.
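The alternating-minimization idea above might be sketched as follows. This is a minimal illustrative toy, not the paper's construction: the specific hinge margin, the Laplace noise law, the unit power constraint, and all dimensions and step sizes are assumptions chosen for the sketch. The decoder compares weighted squared distances ||A(y - c_j)||^2, with A playing the role of the covariance parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instantiation; dimensions, noise law, and step sizes are illustrative
# assumptions, not values taken from the paper.
d, M, n = 3, 6, 200                        # channel dim, codebook size, samples
Z = rng.laplace(scale=0.5, size=(n, d))    # samples of the unknown noise

C = rng.normal(size=(M, d))                # codebook (rows are codewords)
C /= np.linalg.norm(C, axis=1, keepdims=True)   # unit-power codewords
A = np.eye(d)                              # decoder weighting matrix

def surrogate(C, A, z):
    """Hinge-type surrogate for one noise sample z, with gradients.

    When codeword c_i is sent, the received word is y = c_i + z, and the
    generalized minimal-distance decoder compares ||A (y - c_j)||^2.
    A competitor j != i is penalized whenever its distance fails to exceed
    the sent codeword's distance by a unit margin."""
    loss, gC, gA = 0.0, np.zeros_like(C), np.zeros_like(A)
    Az = A @ z
    d_true = Az @ Az                       # ||A z||^2: distance of the sent word
    for i in range(len(C)):
        for j in range(len(C)):
            if j == i:
                continue
            v = C[i] - C[j] + z            # y - c_j for y = c_i + z
            Av = A @ v
            m = 1.0 + d_true - Av @ Av     # hinge margin for the pair (i, j)
            if m > 0:
                loss += m
                gC[i] -= 2.0 * (A.T @ Av)  # push c_i away from c_j
                gC[j] += 2.0 * (A.T @ Av)
                gA += 2.0 * (np.outer(Az, z) - np.outer(Av, v))
    return loss, gC, gA

emp_init = np.mean([surrogate(C, A, z)[0] for z in Z])

# Alternating stochastic-gradient minimization: sweep the noise samples,
# updating the codebook with the decoder frozen, then the decoder with the
# codebook frozen, re-projecting codewords onto the power constraint.
eta = 1e-3
for epoch in range(3):
    for z in Z:                            # codebook phase
        _, gC, _ = surrogate(C, A, z)
        C -= eta * gC
        C /= np.maximum(np.linalg.norm(C, axis=1, keepdims=True), 1.0)
    for z in Z:                            # decoder phase
        _, _, gA = surrogate(C, A, z)
        A -= eta * gA

emp_final = np.mean([surrogate(C, A, z)[0] for z in Z])
print(f"empirical surrogate loss: {emp_init:.3f} -> {emp_final:.3f}")
```

The surrogate is zero for a noise sample exactly when every competing codeword is beaten by a unit margin, so driving it down empirically pushes down the minimal-distance decoding error on the samples.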