Dropping forward-backward algorithms for feature selection
In this era of big data, feature selection techniques, which have long been proven to simplify models, make them more comprehensible, and speed up learning, have become increasingly important. Among the many methods developed, forward, backward, and stepwise feature selection regression remain widely used due to their simplicity and efficiency. However, they do not scale well to large datasets. In this paper, we analyze the issues associated with these approaches and introduce a novel algorithm that boosts speed by a factor of up to 23.11 while maintaining good performance compared to stepwise regression in terms of error sum of squares and number of selected features.
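For context, the following is a minimal sketch of the classic forward selection baseline the abstract refers to, greedily adding the feature that most reduces the error sum of squares (SSE) at each step. This is an illustrative assumption of the standard procedure, not the paper's proposed algorithm; the function name, stopping rule, and data are hypothetical.

```python
# Minimal sketch of forward feature selection by error sum of squares (SSE).
# Illustrates the baseline discussed in the abstract, not the paper's method.
import numpy as np
from sklearn.linear_model import LinearRegression

def forward_select(X, y, max_features=5):
    """Greedily add the feature whose inclusion most reduces SSE."""
    n_features = X.shape[1]
    selected, remaining = [], list(range(n_features))
    for _ in range(min(max_features, n_features)):
        best_sse, best_j = np.inf, None
        for j in remaining:
            cols = selected + [j]
            pred = LinearRegression().fit(X[:, cols], y).predict(X[:, cols])
            sse = np.sum((y - pred) ** 2)
            if sse < best_sse:
                best_sse, best_j = sse, j
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

# Example usage on synthetic data (assumed for illustration)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = 3 * X[:, 2] - 2 * X[:, 7] + rng.normal(size=200)
print(forward_select(X, y, max_features=3))  # should recover features 2 and 7 first
```

Each step refits one model per remaining feature, so the cost grows rapidly with the number of candidate features, which is the scalability issue the abstract motivates addressing.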