Error-Based Knockoffs Inference for Controlled Feature Selection
Xuebin Zhao, Hong Chen, Yingjie Wang, Weifu Li, Tieliang Gong, Yulong Wang, Feng Zheng
[AAAI-22] Main Track
Abstract:
Recently, the model-X knockoffs scheme was proposed as a promising solution for controlled feature selection in high-dimensional, finite-sample settings. However, the model-X knockoffs procedure depends heavily on coefficient-based feature importance and concerns only the control of the false discovery rate (FDR). To further improve its adaptivity and flexibility, in this paper we propose an error-based knockoffs inference method that integrates knockoff features, error-based feature importance statistics, and a stepdown procedure. The proposed inference procedure does not require specifying a regression model and can handle feature selection with theoretical guarantees on controlling the false discovery proportion (FDP), FDR, or k-familywise error rate (k-FWER). Empirical evaluations demonstrate the competitive performance of our approach on both simulated and real data.
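The abstract only sketches the pipeline, so below is a minimal, hypothetical Python illustration of one way its three ingredients could fit together: second-order Gaussian model-X knockoffs, an error-based (permutation-style) importance contrast between each feature and its knockoff computed without specifying a parametric regression model, and the standard knockoff+ threshold used here in place of the paper's stepdown procedure. The function names, the particular importance statistic, and the FDR-only filter are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def gaussian_knockoffs(X, rng):
    """Equi-correlated second-order Gaussian model-X knockoffs (assumes X roughly Gaussian)."""
    n, p = X.shape
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False) + 1e-6 * np.eye(p)
    s_val = 0.99 * min(1.0, 2.0 * np.linalg.eigvalsh(Sigma).min())   # keep 2*Sigma - D PSD
    D = s_val * np.eye(p)
    Sigma_inv = np.linalg.inv(Sigma)
    cond_mean = X - (X - mu) @ Sigma_inv @ D
    cond_cov = 2.0 * D - D @ Sigma_inv @ D
    cond_cov = (cond_cov + cond_cov.T) / 2 + 1e-8 * np.eye(p)
    return cond_mean + rng.standard_normal((n, p)) @ np.linalg.cholesky(cond_cov).T

def error_importance(model, X_aug, y, j, rng, n_rep=3):
    """Error-based importance: increase in MSE after permuting column j of the augmented design."""
    base = np.mean((y - model.predict(X_aug)) ** 2)
    incs = []
    for _ in range(n_rep):
        Xp = X_aug.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])
        incs.append(np.mean((y - model.predict(Xp)) ** 2) - base)
    return float(np.mean(incs))

def knockoff_plus_threshold(W, q):
    """Smallest t with estimated FDP (1 + #{W_j <= -t}) / #{W_j >= t} <= q."""
    for t in np.sort(np.abs(W[W != 0])):
        if (1 + np.sum(W <= -t)) / max(1, np.sum(W >= t)) <= q:
            return t
    return np.inf

# Toy data: only the first k features are truly relevant.
rng = np.random.default_rng(0)
n, p, k, q = 600, 40, 8, 0.2
X = rng.standard_normal((n, p))
beta = np.zeros(p); beta[:k] = 1.5
y = X @ beta + rng.standard_normal(n)

X_tilde = gaussian_knockoffs(X, rng)
X_aug = np.hstack([X, X_tilde])              # the model is never told which half is real
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_aug, y)

Z = np.array([error_importance(model, X_aug, y, j, rng) for j in range(p)])
Z_tilde = np.array([error_importance(model, X_aug, y, p + j, rng) for j in range(p)])
W = Z - Z_tilde                              # flip-sign statistic: symmetric around 0 under the null

t = knockoff_plus_threshold(W, q)
print("selected features:", np.where(W >= t)[0])
```

The contrast Z_j - Z_tilde_j is what makes an error-based importance usable as a knockoff statistic: swapping a null feature with its knockoff leaves the augmented fit's error profile unchanged in distribution, so null W_j are sign-symmetric. The paper's own statistics and its stepdown procedure for FDP and k-FWER control are a separate construction not reproduced here.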
Sessions where this paper appears
- Poster Session 5: Sat, February 26, 12:45 AM - 2:30 AM (+00:00), Blue 1
- Poster Session 10: Sun, February 27, 4:45 PM - 6:30 PM (+00:00), Blue 1