Technical Program

Paper Detail

Paper ID L.13.2
Paper Title Rényi Entropy Bounds on the Active Learning Cost-Performance Tradeoff
Authors Vahid Jamali, University of Erlangen-Nuremberg, Germany; Antonia Tulino, University of Napoli Federico II, Italy; Jaime Llorca and Elza Erkip, New York University, New York, United States
Session L.13: Online, Active, and Transfer Learning
Presentation Lecture
Track Statistics and Learning Theory
Abstract Semi-supervised classification, one of the most prominent fields in machine learning, studies how to combine the statistical knowledge of the often abundant unlabeled data with the often limited labeled data in order to maximize overall classification accuracy. In this context, the process of actively choosing the data to be labeled is referred to as active learning. In this paper, we initiate the non-asymptotic analysis of the optimal policy for semi-supervised classification with actively obtained labeled data. Considering a general Bayesian classification model, we provide the first characterization of the jointly optimal active learning and semi-supervised classification policy, in terms of the cost-performance tradeoff driven by the label query budget (the number of data items to be labeled) and the overall classification accuracy. Leveraging recent results on the Rényi entropy, we derive tight information-theoretic bounds on this active learning cost-performance tradeoff.
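To make the quantities in the abstract concrete, the sketch below computes the Rényi entropy of order α, H_α(p) = (1/(1-α)) log Σᵢ pᵢ^α, and pairs it with a toy uncertainty-sampling heuristic: given a label query budget, query the items whose posterior label distribution is most uncertain. This heuristic is an illustrative assumption for exposition only, not the jointly optimal policy characterized in the paper.

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (in nats) of a discrete distribution p.

    Requires alpha > 0 and alpha != 1; as alpha -> 1 the Rényi entropy
    converges to the Shannon entropy.
    """
    if alpha <= 0 or alpha == 1:
        raise ValueError("alpha must be positive and not equal to 1")
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

def select_queries(posteriors, budget, alpha=2.0):
    """Toy active-learning heuristic (an assumption, not the paper's policy):
    spend the label query budget on the items whose posterior label
    distribution has the highest Rényi entropy, i.e. the most uncertain ones.
    """
    ranked = sorted(range(len(posteriors)),
                    key=lambda i: renyi_entropy(posteriors[i], alpha),
                    reverse=True)
    return ranked[:budget]

uniform = [0.25] * 4                  # maximally uncertain over 4 classes
peaked = [0.97, 0.01, 0.01, 0.01]     # nearly certain
print(select_queries([peaked, uniform, peaked], budget=1))  # → [1]
```

With a budget of one query, the heuristic picks the uniform (most uncertain) posterior; the order-2 Rényi entropy of the uniform distribution over four classes equals log 4, its maximum.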



2021 IEEE International Symposium on Information Theory

11-16 July 2021 | Melbourne, Victoria, Australia
