Paper ID | G.3.2
Paper Title | Some Performance Guarantees of Global LASSO with Local Assumptions for Convolutional Sparse Design Matrices
Authors | Avishek Ghosh, Kannan Ramchandran, University of California, Berkeley, United States
Session | G.3: Compressed Sensing
Presentation | Lecture
Track | Graphs, Games, Sparsity, and Signal Processing
Abstract | We analyze the performance of the LASSO algorithm (basis pursuit, Tibshirani et al., '96) for a class of structured matrices dubbed convolutional sparse matrices. Analyzing such matrices is of paramount interest since, in many signal processing applications (including computer vision, image, and audio processing), a global analysis of the underlying signal often entails understanding the behavior of a convolutional sparse matrix. We show that LASSO ($\ell_1$-regularized least squares) with such matrices succeeds under a constraint on local sparsity, as opposed to global sparsity. This conversion from a global to a local constraint is of crucial significance in the above-mentioned applications. Under sufficiency conditions such as the Restricted Eigenvalue (RE) condition and the Exact Recovery Coefficient (ERC), we obtain the prediction error rate (in $\ell_2$ norm) and the estimation error rate for the LASSO estimator with local constraints. Furthermore, we obtain an estimation error rate for the LASSO estimator in $\ell_{\infty}$ norm under a Gaussian noise model.
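The following is a minimal illustrative sketch (not the authors' code) of the setting the abstract describes: a convolutional sparse design matrix whose columns are shifted copies of short local filters, with a sparse code recovered by solving the LASSO via plain ISTA. All names and parameters (n, num_filters, filter_len, lam, the noise level) are assumptions chosen for illustration, not values from the paper.

```python
# Sketch: LASSO on a convolutional sparse design matrix, solved with ISTA.
# Illustrative only; parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n, num_filters, filter_len = 128, 4, 8      # signal length, #filters, local support

# Build the design matrix: each column is a circular shift of a short filter.
filters = rng.standard_normal((num_filters, filter_len))
cols = []
for f in filters:
    padded = np.zeros(n)
    padded[:filter_len] = f
    for shift in range(n):
        cols.append(np.roll(padded, shift))
X = np.stack(cols, axis=1)                  # shape (n, n * num_filters)
X /= np.linalg.norm(X, axis=0)              # unit-norm columns

# Ground-truth code with few active shifts, observed under Gaussian noise.
beta_true = np.zeros(X.shape[1])
beta_true[rng.choice(X.shape[1], size=6, replace=False)] = rng.standard_normal(6)
y = X @ beta_true + 0.05 * rng.standard_normal(n)

# ISTA for the LASSO objective (1/2)||y - X b||_2^2 + lam ||b||_1.
lam = 0.1
step = 1.0 / np.linalg.norm(X, 2) ** 2      # 1 / (spectral norm of X)^2
beta = np.zeros(X.shape[1])
for _ in range(500):
    z = beta - step * (X.T @ (X @ beta - y))               # gradient step
    beta = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold

print("estimation error (l2):", np.linalg.norm(beta - beta_true))
print("prediction error (l2):", np.linalg.norm(X @ (beta - beta_true)))
```

The two printed quantities correspond to the estimation and prediction errors whose rates the paper bounds; the paper's contribution is that such guarantees hold under a local sparsity constraint on the code rather than a global one.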