
A Nonparametric Statistics Approach to Feature Selection in Deep Neural Networks with Theoretical Guarantees

Abstract

This paper tackles the problem of feature selection in a highly challenging setting: the nonparametric regression model $y = f(\mathbf{x}_{\mathcal{S}}) + \varepsilon$, where $\mathcal{S} \subseteq \{1, \ldots, p\}$ is the set of relevant features and $f$ is an unknown, potentially nonlinear function subject to mild smoothness conditions. Our approach begins with feature selection in deep neural networks and then generalizes the results to Hölder smooth functions by exploiting the strong approximation capabilities of neural networks. Unlike conventional optimization-based deep learning methods, we reformulate neural networks as index models and estimate the index parameters using the second-order Stein's formula. This gradient-descent-free strategy guarantees feature selection consistency under an explicit sample size requirement stated in terms of the feature dimension $p$. To handle high-dimensional scenarios, we further introduce a screening-and-selection mechanism that achieves nonlinear selection consistency under an explicit growth condition involving the sparsity level $s = |\mathcal{S}|$. Additionally, we refit a neural network on the selected features for prediction and establish performance guarantees under a relaxed sparsity assumption. Extensive simulations and real-data analyses demonstrate the strong performance of our method even in the presence of complex feature interactions.
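For readers who have not seen it, the following is a minimal sketch of the second-order Stein's formula in its standard form for a Gaussian design; it illustrates the generic identity behind gradient-free Hessian estimation, and is not necessarily the exact estimator or design assumption used in this paper.

% Second-order Stein's identity for a standard Gaussian design, x ~ N(0, I_p):
% if y = f(x) + eps with E[eps | x] = 0 and f twice differentiable with
% suitable integrability, then the expected Hessian of f is identified by a
% moment of the observed data alone, with no gradient descent involved.
\[
  \mathbb{E}\!\left[\, y \left( x x^{\top} - I_p \right) \right]
  \;=\;
  \mathbb{E}\!\left[\, \nabla^2 f(x) \,\right].
\]

Since the $j$-th row of $\nabla^2 f$ vanishes identically whenever $f$ does not depend on $x_j$, the row norms of a sample estimate of the left-hand side give a natural ranking of feature relevance, which is the intuition behind a screening-and-selection step of the kind described above.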