Computationally Efficient Feature Significance and Importance for Machine Learning Models

05/23/2019
by Enguerrand Horel, et al.

We develop a simple and computationally efficient significance test for the features of a machine learning model. Our forward-selection approach applies to any model specification, learning task and variable type. The test is non-asymptotic, straightforward to implement, and does not require model refitting. It identifies the statistically significant features as well as feature interactions of any order in a hierarchical manner, and generates a model-free notion of feature importance. Numerical results illustrate its performance.
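Since the abstract does not spell out the test itself, the following is a minimal, generic sketch of one way to screen the features of an already-fitted model without refitting, by permuting held-out columns and comparing predictive scores. The permutation-based score drop, the p-value construction, and all names below are illustrative assumptions for exposition, not the authors' method.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic data: only features 0 and 1 matter (feature 1 only via an interaction).
n, d = 2000, 6
X = rng.normal(size=(n, d))
y = 2.0 * X[:, 0] + X[:, 0] * X[:, 1] + 0.1 * rng.normal(size=n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit the model once; no refitting happens in the screening loop below.
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

def permutation_score_drop(model, X, y, j, n_perm=50, rng=rng):
    """Drop in held-out R^2 when feature j is permuted, plus a crude
    permutation-based p-value (an assumed stand-in for the paper's test)."""
    base = model.score(X, y)
    drops = []
    for _ in range(n_perm):
        Xp = X.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])
        drops.append(base - model.score(Xp, y))
    drops = np.asarray(drops)
    # Crude p-value: fraction of permutations where breaking the feature
    # did not hurt the score at all.
    p_value = (np.sum(drops <= 0) + 1) / (n_perm + 1)
    return drops.mean(), p_value

# Screen features on held-out data, ranked by mean score drop.
results = sorted(
    ((j, *permutation_score_drop(model, X_test, y_test, j)) for j in range(d)),
    key=lambda t: -t[1],
)
for j, drop, p in results:
    flag = "significant" if p < 0.05 else "not significant"
    print(f"feature {j}: mean score drop = {drop:.4f}, p = {p:.3f} ({flag})")
```

In this toy setup, only the informative columns should show a large score drop with a small p-value; the paper's actual test additionally handles feature interactions of any order in a hierarchical, forward-selection manner.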

