Technical Program

Paper Detail

Paper ID E.6.2
Paper Title Hypothesis Testing Against Independence Under Gaussian Noise
Authors Abdellatif Zaidi, Université Paris-Est, France
Session E.6: Hypothesis Testing II
Presentation Lecture
Track Detection and Estimation
Abstract We study a variant of the many-help-one hypothesis testing against independence problem in which the source, not necessarily Gaussian, has finite differential entropy and the observation noises under the null hypothesis are Gaussian. Under the criterion that stipulates minimization of the Type II error exponent subject to a (constant) bound $\epsilon$ on the Type I error rate, we derive an upper bound on the exponent-rates function. The bound is shown to mirror a corresponding explicit lower bound, except that the lower bound involves the source power (variance) whereas the upper bound involves the source entropy power. Part of the utility of the established bound is for investigating asymptotic exponents/rates and the losses incurred by distributed detection as a function of the number of observations.
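To see why the two bounds mirror each other, it may help to recall the standard definition of entropy power and its relation to variance; this is textbook background, not material taken from the paper itself:

```latex
% Entropy power of a continuous random variable X with differential entropy h(X):
%   N(X) = \frac{1}{2\pi e}\, e^{2 h(X)}.
% Since the Gaussian distribution maximizes differential entropy for a given
% variance, we always have
%   N(X) \le \operatorname{Var}(X),
% with equality if and only if X is Gaussian.
\[
  N(X) \;=\; \frac{1}{2\pi e}\, e^{2 h(X)} \;\le\; \operatorname{Var}(X),
\]
```

In particular, for a Gaussian source the entropy power equals the variance, so the stated upper bound (in terms of entropy power) and lower bound (in terms of variance) coincide in that case.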

2021 IEEE International Symposium on Information Theory

11-16 July 2021 | Melbourne, Victoria, Australia