Technical Program

Paper Detail

Paper ID L.4.5
Paper Title Learning Additive Noise Channels: Generalization Bounds and Algorithms
Authors Nir Weinberger, Massachusetts Institute of Technology, United States
Session L.4: Distribution Learning
Presentation Lecture
Track Statistics and Learning Theory
Abstract An additive noise channel is considered, in which the noise distribution is unknown and is not known to belong to any parametric family. The problem of designing a codebook and a generalized minimal-distance decoder (parameterized by a covariance matrix) based on samples of the noise is considered. High-probability generalization bounds are provided for the error-probability loss function, as well as for a hinge-type surrogate loss function. A stochastic-gradient-based alternating-minimization algorithm for the latter loss function is presented. Bounds on the average empirical error and generalization error are provided for a Gibbs-based algorithm that gradually expurgates codewords from a large initial codebook to obtain a smaller codebook with improved error probability.
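The stochastic-gradient alternating-minimization idea mentioned in the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's actual algorithm: the dimensions, the Laplace noise model, the margin value, the learning rate, and the joint update of codebook and decoder metric are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small instance: k codewords in R^d, n i.i.d. noise samples.
d, k, n = 4, 3, 200
codebook = rng.standard_normal((k, d))
A = np.eye(d)                       # decoder metric (covariance-type parameter)
noise = rng.laplace(size=(n, d))    # unknown noise; only samples are available

def quad(y, x, A):
    """Generalized minimal-distance score (y - x)^T A (y - x)."""
    diff = y - x
    return diff @ A @ diff

lr = 1e-3
for step in range(200):
    i = rng.integers(k)             # index of the transmitted codeword
    z = noise[rng.integers(n)]      # a sampled noise realization
    y = codebook[i] + z             # channel output
    # Hinge-type surrogate: margin between the correct codeword's score
    # and the best competing codeword's score.
    scores = np.array([quad(y, c, A) for c in codebook])
    j = np.argmin(np.where(np.arange(k) == i, np.inf, scores))
    margin = scores[j] - scores[i]
    if margin < 1.0:                # hinge is active: take a gradient step
        di, dj = y - codebook[i], y - codebook[j]
        # Gradients of scores[i] - scores[j] w.r.t. A and the two codewords.
        A -= lr * (np.outer(di, di) - np.outer(dj, dj))
        codebook[i] -= lr * (-2 * A @ di)
        codebook[j] -= lr * (2 * A @ dj)
        # Project A back onto symmetric positive-definite matrices.
        A = (A + A.T) / 2
        w, V = np.linalg.eigh(A)
        A = V @ np.diag(np.clip(w, 1e-6, None)) @ V.T
```

The alternation here is between decoding with the current metric (the `argmin`) and gradient updates of the codebook and metric on the hinge surrogate, with an eigenvalue projection keeping the metric positive definite.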



2021 IEEE International Symposium on Information Theory

11-16 July 2021 | Melbourne, Victoria, Australia
