DocumentCode :
3724165
Title :
Sequential Model-Free Hyperparameter Tuning
Author :
Martin Wistuba;Nicolas Schilling;Lars Schmidt-Thieme
Author_Institution :
Inf. Syst. &
fYear :
2015
Firstpage :
1033
Lastpage :
1038
Abstract :
Hyperparameter tuning is often done manually, but recent research has shown that automatic tuning finds effective hyperparameter configurations faster and without requiring expertise. To further improve the search, recent publications propose transferring knowledge from previous experiments to new ones. We adapt sequential model-based optimization by replacing its surrogate model and acquisition function with a single policy that is optimized for the task of hyperparameter tuning. This policy generalizes over previous experiments without using a model or meta-features, yet outperforms the state of the art. We show that a static ranking of hyperparameter combinations yields competitive results and substantially outperforms a random hyperparameter search. It is thus a fast and easy alternative to complex hyperparameter tuning strategies, allowing practitioners to tune their hyperparameters with a simple look-up table. We have made look-up tables for two classifiers, SVM and AdaBoost, publicly available. Furthermore, we propose a similarity measure for data sets that yields more comprehensible results than meta-feature-based measures. We show how this similarity measure can be applied to surrogate models in the SMBO framework and demonstrate empirically that this change leads to better hyperparameter configurations in fewer trials.
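The static-ranking idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the configurations, the `toy_loss` surface, and the function names are hypothetical stand-ins (the paper's actual look-up tables for SVM and AdaBoost are published separately).

```python
import random

# Hypothetical static ranking of SVM hyperparameter configurations
# (C, gamma), ordered by their rank aggregated over previous experiments.
STATIC_RANKING = [
    {"C": 1.0, "gamma": 0.1},
    {"C": 10.0, "gamma": 0.01},
    {"C": 100.0, "gamma": 0.001},
    {"C": 0.1, "gamma": 1.0},
]

def tune_static(evaluate, budget):
    """Sequential model-free tuning: try configurations in the
    precomputed order and keep the best one seen so far."""
    best_cfg, best_loss = None, float("inf")
    for cfg in STATIC_RANKING[:budget]:
        loss = evaluate(cfg)
        if loss < best_loss:
            best_cfg, best_loss = cfg, loss
    return best_cfg, best_loss

def tune_random(evaluate, budget, seed=0):
    """Baseline: sample configurations uniformly at random
    from the same (stand-in) search space."""
    rng = random.Random(seed)
    best_cfg, best_loss = None, float("inf")
    for _ in range(budget):
        cfg = rng.choice(STATIC_RANKING)
        loss = evaluate(cfg)
        if loss < best_loss:
            best_cfg, best_loss = cfg, loss
    return best_cfg, best_loss

# Toy loss surface for demonstration only.
def toy_loss(cfg):
    return abs(cfg["C"] - 10.0) / 10.0 + abs(cfg["gamma"] - 0.01)

best_cfg, best_loss = tune_static(toy_loss, budget=3)
print(best_cfg)  # → {'C': 10.0, 'gamma': 0.01}
```

The point of the sketch is that, unlike SMBO, no surrogate model is fitted and no meta-features are computed at tuning time: the entire transfer of knowledge is baked into the ordering of `STATIC_RANKING`.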
Keywords :
"Tuning","Optimization","Data models","Adaptation models","Loss measurement","Machine learning algorithms"
Publisher :
ieee
Conference_Titel :
2015 IEEE International Conference on Data Mining (ICDM)
ISSN :
1550-4786
Type :
conf
DOI :
10.1109/ICDM.2015.20
Filename :
7373431