
Paper Detail

Paper ID L.2.4
Paper Title Analytic Study of Double Descent in Binary Classification: The Impact of Loss
Authors Ganesh Ramachandra Kini, Christos Thrampoulidis, University of California, Santa Barbara, United States
Session L.2: Classification
Presentation Lecture
Track Statistics and Learning Theory
Abstract Extensive empirical evidence reveals that, for a wide range of learning methods and data sets, the risk curve exhibits a double-descent (DD) trend as a function of the model size. In our recent coauthored paper [Deng et al., '19], we proposed simple binary linear classification models and showed that the test error of gradient descent (GD) with logistic loss undergoes a DD. In this paper, we complement these results by extending them to GD with square loss. We show that the DD phenomenon persists, but we also identify several differences compared to logistic loss. This emphasizes that crucial features of DD curves (such as their transition threshold and global minima) depend both on the training data and on the learning algorithm. We further study the dependence of DD curves on the size of the training set. As in [Deng et al., '19], our results are analytic: we plot the DD curves by first deriving sharp asymptotics for the test error under Gaussian features. Albeit simple, the models permit a principled study, the outcomes of which theoretically corroborate related empirical findings in more complex learning tasks.
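The double-descent trend the abstract describes can be illustrated with a small simulation. The sketch below is not the paper's analytic derivation; it is a hedged numerical example in which "model size" is the number of Gaussian features retained, the square-loss classifier is the minimum-norm least-squares fit, and all dimensions (n, d, the grid of p values) are illustrative choices. The averaged test error typically spikes near the interpolation threshold p = n and then descends again as p grows.

```python
# Numerical sketch of double descent for square-loss linear classification
# with Gaussian features (illustrative parameters; not the paper's setup).
import numpy as np

rng = np.random.default_rng(0)

n, d, n_test, n_trials = 50, 300, 2000, 20
p_grid = [10, 25, 50, 100, 300]      # model sizes; interpolation threshold at p = n = 50

beta = np.ones(d) / np.sqrt(d)       # ground-truth direction (hypothetical signal model)

errors = {p: 0.0 for p in p_grid}
for _ in range(n_trials):
    X = rng.standard_normal((n, d))
    y = np.sign(X @ beta)            # noiseless binary labels from the full model
    X_test = rng.standard_normal((n_test, d))
    y_test = np.sign(X_test @ beta)
    for p in p_grid:
        # Minimum-norm least-squares ("square loss") fit on the first p features.
        w = np.linalg.pinv(X[:, :p]) @ y
        y_hat = np.sign(X_test[:, :p] @ w)
        errors[p] += np.mean(y_hat != y_test) / n_trials

for p in p_grid:
    print(f"p = {p:4d}  avg test error {errors[p]:.3f}")
```

Averaging over trials smooths the curve enough that the spike at p = n and the second descent for p > n are visible in the printed errors; the paper instead characterizes this curve exactly via sharp asymptotics.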



2021 IEEE International Symposium on Information Theory

11-16 July 2021 | Melbourne, Victoria, Australia
