On the Performance of Differential Evolution for Hyperparameter Tuning

04/15/2019
by Mischa Schmidt, et al.

Automated hyperparameter tuning aspires to facilitate the application of machine learning for non-experts. In the literature, different optimization approaches are applied for that purpose. This paper investigates the performance of Differential Evolution for tuning the hyperparameters of supervised learning algorithms on classification tasks. This empirical study involves a range of machine learning algorithms and datasets with various characteristics, comparing Differential Evolution against Sequential Model-based Algorithm Configuration (SMAC), a reference Bayesian Optimization approach. The results indicate that Differential Evolution outperforms SMAC for most datasets when tuning a given machine learning algorithm, particularly when ties are broken in a first-to-report fashion; only for the tightest of computational budgets does SMAC perform better. On small datasets, Differential Evolution outperforms SMAC by 19% (a larger margin when tie-breaking). In a second experiment across a range of representative datasets taken from the literature, Differential Evolution scores 15% more wins than SMAC (again more when tie-breaking).
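
For context, the following is a minimal sketch of the general technique the paper evaluates: using Differential Evolution to tune a classifier's hyperparameters by minimizing cross-validated error. The choice of SciPy's `differential_evolution`, an SVM with `C` and `gamma` as the tuned hyperparameters, the `digits` dataset, and the log-space search bounds are all illustrative assumptions, not the paper's actual experimental setup.

```python
# Sketch: hyperparameter tuning with Differential Evolution.
# Assumptions (not from the paper): SVM classifier, digits dataset,
# two hyperparameters (C, gamma) searched in log10-space.
from scipy.optimize import differential_evolution
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)

def objective(params):
    # DE minimizes, so return the negated cross-validated accuracy.
    log_c, log_gamma = params
    clf = SVC(C=10.0 ** log_c, gamma=10.0 ** log_gamma)
    return -cross_val_score(clf, X, y, cv=3).mean()

# Bounds for log10(C) and log10(gamma); chosen purely for illustration.
bounds = [(-3, 3), (-6, 0)]

# Small population and few generations to keep the evaluation budget low;
# polish=False avoids extra local-search evaluations after DE finishes.
result = differential_evolution(objective, bounds,
                                maxiter=5, popsize=8,
                                polish=False, seed=0)

print("best CV accuracy:", -result.fun)
print("best C, gamma:", 10.0 ** result.x)
```

Each objective evaluation trains and cross-validates a full model, so the evaluation budget (population size times generations) is the natural cost axis on which the paper compares Differential Evolution with SMAC.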

