Paper Detail

Paper ID L.9.4
Paper Title Max-affine Regression with Universal Parameter Estimation for Small-ball Designs
Authors Avishek Ghosh, Ashwin Pananjady, Kannan Ramchandran, Aditya Guntuboyina, University of California, Berkeley, United States
Session L.9: Learning Methods and Networks
Presentation Lecture
Track Statistics and Learning Theory
Abstract We study the max-affine regression model, in which the unknown regression function is modeled as the maximum of a fixed number of affine functions. In recent work (Ghosh et al., 2019), we showed that end-to-end parameter estimates are obtainable in this model with an alternating minimization (AM) algorithm, provided the covariates (or designs) are normally distributed and chosen independently of the underlying parameters. In this paper, we show that AM is significantly more robust than the setting of Ghosh et al. (2019) suggests: it converges locally under small-ball design assumptions (a much broader class that includes bounded log-concave distributions), and even when the underlying parameters are chosen with knowledge of the realized covariates. Once again, the final rate obtained by the procedure is near-parametric and minimax optimal (up to a polylogarithmic factor) as a function of the dimension, sample size, and noise variance. As a by-product of our analysis, we obtain convergence guarantees for a classical algorithm for the (real) phase retrieval problem in the presence of noise under considerably weaker assumptions on the design distribution than were previously known.
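
For readers unfamiliar with the setup, the sketch below simulates the max-affine model y_i = max_j (<theta_j, x_i> + b_j) + noise and runs a basic alternating minimization loop. It is only an illustrative reading of the abstract, not the authors' procedure: the paper's guarantees concern local convergence from a suitable initialization, whereas this sketch starts from a random guess and may settle in a poor local minimum. The helper names (generate_max_affine_data, alternating_minimization), the uniform bounded design (one example of a small-ball distribution), and all constants are assumptions made for illustration.

```python
import numpy as np

def generate_max_affine_data(n, d, k, noise_std=0.1, rng=None):
    """Simulate y_i = max_j (<theta_j, x_i> + b_j) + eps_i.
    Uses a bounded uniform design, one example of a small-ball distribution."""
    rng = np.random.default_rng(rng)
    X = rng.uniform(-1.0, 1.0, size=(n, d))
    theta = rng.normal(size=(k, d))
    b = rng.normal(size=k)
    y = (X @ theta.T + b).max(axis=1) + noise_std * rng.normal(size=n)
    return X, y, theta, b

def alternating_minimization(X, y, k, n_iters=50, rng=None):
    """Basic AM for max-affine regression (illustrative, random initialization):
    alternate between assigning each sample to the affine piece attaining the max
    under the current estimates, and refitting each piece by least squares."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    Xa = np.hstack([X, np.ones((n, 1))])   # append intercept column
    params = rng.normal(size=(k, d + 1))   # row j holds (theta_j, b_j)
    for _ in range(n_iters):
        # Assignment step: piece attaining the max for each sample.
        assignment = (Xa @ params.T).argmax(axis=1)
        # Refit step: least squares within each piece.
        for j in range(k):
            idx = assignment == j
            if idx.sum() >= d + 1:         # skip pieces with too few samples
                params[j], *_ = np.linalg.lstsq(Xa[idx], y[idx], rcond=None)
    return params

if __name__ == "__main__":
    X, y, theta, b = generate_max_affine_data(n=5000, d=5, k=3, rng=0)
    est = alternating_minimization(X, y, k=3, rng=1)
    y_hat = (np.hstack([X, np.ones((len(X), 1))]) @ est.T).max(axis=1)
    print("prediction RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))
```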
