Paper ID | L.13.3
Paper Title | An Efficient Running Quantile Estimation Technique alongside Correntropy for Outlier Rejection in Online Regression
Authors | Sajjad Bahrami, Ertem Tuncel, University of California, Riverside, United States
Session | L.13: Online, Active, and Transfer Learning
Presentation | Lecture
Track | Statistics and Learning Theory
Abstract | In this paper, online linear regression in the presence of non-Gaussian noise is addressed. In such environments, the error samples (the error between the system output and the labels) contain outliers and/or the error does not follow a Gaussian distribution. Information-theoretic measures such as the error entropy criterion (EEC) and the error correntropy criterion (ECC) are known for their superior performance compared to the mean square error (MSE) in these cases. An efficient running quantile estimation technique based on quantization of the error samples is introduced; used alongside correntropy, it achieves lower steady-state misalignment than previous algorithms.
Conference | 2021 IEEE International Symposium on Information Theory, 11-16 July 2021, Melbourne, Victoria, Australia
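The abstract does not spell out the quantization-based quantile estimator or exactly how the quantile feeds the correntropy update, so the sketch below is only one plausible reading of the idea: a fixed-grid histogram tracks a running quantile of the error magnitudes online, and that quantile sets the Gaussian kernel width in an ECC/MCC-style stochastic-gradient step, so that outlier errors are exponentially down-weighted. The class and function names (`QuantizedRunningQuantile`, `ecc_lms_step`), the bin parameters, and the quantile-to-kernel-width coupling are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np


class QuantizedRunningQuantile:
    """Running quantile estimate of |error| over a fixed quantization grid.

    Generic histogram-style sketch; the paper's actual quantization-based
    estimator may differ. The range (lo, hi) and bin count are assumptions.
    """

    def __init__(self, q=0.9, lo=0.0, hi=10.0, n_bins=256):
        self.q = q
        self.edges = np.linspace(lo, hi, n_bins + 1)  # quantization levels
        self.counts = np.zeros(n_bins)

    def update(self, abs_error):
        # Map the new |error| sample to its quantization bin and count it.
        idx = np.searchsorted(self.edges, abs_error, side="right") - 1
        idx = int(np.clip(idx, 0, len(self.counts) - 1))
        self.counts[idx] += 1

    def estimate(self):
        # Read the q-quantile off the cumulative bin counts.
        total = self.counts.sum()
        if total == 0:
            return self.edges[-1]
        cum = np.cumsum(self.counts)
        idx = int(np.searchsorted(cum, self.q * total))
        return self.edges[min(idx + 1, len(self.edges) - 1)]


def ecc_lms_step(w, x, y, qtracker, lr=0.01):
    """One stochastic-gradient step under the error correntropy criterion.

    Tying the Gaussian kernel width to the running error quantile is an
    illustrative assumption: the exponential weight shrinks the update for
    outlier errors while leaving typical errors almost untouched.
    """
    e = y - w @ x                                # instantaneous error sample
    qtracker.update(abs(e))
    sigma = max(qtracker.estimate(), 1e-6)       # robust scale from the quantile
    gain = np.exp(-e ** 2 / (2.0 * sigma ** 2))  # correntropy-induced weight
    return w + lr * gain * e * x


# Toy usage: identify a linear system under impulsive (non-Gaussian) noise.
rng = np.random.default_rng(0)
w_true = rng.standard_normal(5)
w_hat = np.zeros(5)
qtracker = QuantizedRunningQuantile(q=0.9)
for _ in range(5000):
    x = rng.standard_normal(5)
    noise = rng.standard_normal() + (rng.random() < 0.05) * 50.0 * rng.standard_normal()
    y = w_true @ x + noise
    w_hat = ecc_lms_step(w_hat, x, y, qtracker, lr=0.05)
print("relative misalignment:", np.linalg.norm(w_hat - w_true) / np.linalg.norm(w_true))
```

Using a high quantile rather than a mean-based scale keeps the kernel width insensitive to the outliers themselves, which is the usual motivation for pairing a robust running scale estimate with correntropy in online regression.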