Hypothesis Testing Against Independence Under Gaussian Noise
Abdellatif Zaidi, Université Paris-Est, France
E.6: Hypothesis Testing II
Detection and Estimation
We study a variant of the many-help-one hypothesis testing against independence problem in which the source, not necessarily Gaussian, has finite differential entropy and the observation noises under the null hypothesis are Gaussian. Under the criterion that stipulates maximization of the Type II error exponent subject to a (constant) upper bound $\epsilon$ on the Type I error rate, we derive an upper bound on the exponent-rates function. The bound mirrors a corresponding explicit lower bound, except that the lower bound involves the source power (variance) whereas the upper bound involves the source entropy power. Part of the utility of the established bound lies in investigating asymptotic exponents/rates and the losses incurred by distributed detection as a function of the number of observations.
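Since the gap between the two bounds is exactly the gap between variance and entropy power, it may help to recall the standard definition; the following is textbook information theory, not taken from the manuscript itself:

```latex
% Entropy power of a real-valued source X with differential entropy h(X):
N(X) = \frac{1}{2\pi e}\, e^{2 h(X)} .
% The maximum-entropy property of the Gaussian gives, for any X with
% variance \sigma^2,
h(X) \le \tfrac{1}{2}\log\!\left(2\pi e\,\sigma^2\right)
\quad\Longrightarrow\quad
N(X) \le \operatorname{Var}(X),
% with equality if and only if X is Gaussian.
```

In particular, for a Gaussian source the entropy power equals the variance, so the upper and lower bounds on the exponent-rates function match in that case.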