Technical Program

Paper Detail

Paper ID: L.10.6
Paper Title: Exploring Unique Relevance for Mutual Information based Feature Selection
Authors: Shiyu Liu, Mehul Motani (National University of Singapore, Singapore)
Session: L.10, Learning Theory I
Presentation: Lecture
Track: Statistics and Learning Theory
Abstract: Mutual Information (MI), a measure from information theory, is widely used in feature selection. Despite its great success, a promising feature property, namely the unique relevance (UR) of a feature, remains unexplored. In this paper, we improve the performance of mutual information based feature selection (MIBFS) by exploring the utility of unique relevance (UR). We provide a theoretical justification for the value of UR and prove that the optimal feature subset must contain all features with UR. Since existing MIBFS follows the criterion of Maximize Relevance with Minimum Redundancy (MRwMR), which ignores the UR of features, we augment it to include the objective of boosting unique relevance (BUR). This leads to a new criterion for MIBFS, called MRwMR-BUR. We conduct experiments on six public datasets, and the results indicate that MRwMR-BUR consistently outperforms MRwMR when tested with three popular classifiers. We believe this new insight can lead to new optimality bounds and algorithms.
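The MRwMR criterion that the abstract builds on can be illustrated with a minimal greedy sketch: at each step, pick the candidate feature with the highest relevance I(f; y) minus its mean redundancy with already-selected features. This is a generic illustration of the MRwMR family (in the spirit of mRMR), not the paper's MRwMR-BUR method, and the helper names `mutual_info` and `mrwmr_select` are hypothetical, assuming discrete-valued features.

```python
import numpy as np

def mutual_info(x, y):
    """Empirical mutual information I(X;Y) in nats for discrete 1-D arrays."""
    xs, x_idx = np.unique(x, return_inverse=True)
    ys, y_idx = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xs), len(ys)))
    np.add.at(joint, (x_idx, y_idx), 1)   # joint count table
    joint /= joint.sum()                  # joint probability p(x, y)
    px = joint.sum(axis=1, keepdims=True) # marginal p(x), column vector
    py = joint.sum(axis=0, keepdims=True) # marginal p(y), row vector
    nz = joint > 0                        # avoid log(0) on empty cells
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

def mrwmr_select(X, y, k):
    """Greedy MRwMR-style forward selection of k column indices of X:
    maximize relevance I(f; y) minus mean redundancy with selected features."""
    selected = []
    remaining = list(range(X.shape[1]))
    for _ in range(k):
        best, best_score = None, -np.inf
        for f in remaining:
            rel = mutual_info(X[:, f], y)
            red = (np.mean([mutual_info(X[:, f], X[:, s]) for s in selected])
                   if selected else 0.0)
            if rel - red > best_score:
                best, best_score = f, rel - red
        selected.append(best)
        remaining.remove(best)
    return selected
```

A duplicated feature shows the redundancy term at work: a copy of an already-selected feature keeps its full relevance but incurs matching redundancy, so its score drops to roughly zero and other features can overtake it.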

2021 IEEE International Symposium on Information Theory

11-16 July 2021 | Melbourne, Victoria, Australia
