CONF
Sivaprasad_ICML2020_2020/IDIAP
Optimizer Benchmarking Needs to Account for Hyperparameter Tuning
Sivaprasad, Prabhu Teja
Mai, Florian
Vogels, Thijs
Jaggi, Martin
Fleuret, François
Benchmarking
Hyperparameter optimization
optimization
EXTERNAL
https://publications.idiap.ch/attachments/papers/2020/Sivaprasad_ICML2020_2020.pdf
PUBLIC
https://publications.idiap.ch/index.php/publications/showcite/Sivaprasad_Idiap-RR-19-2019
Related documents
Proceedings of the 37th International Conference on Machine Learning
Vienna, Austria
2020
https://icml.cc/Conferences/2020/Schedule?showEvent=6589
URL
The performance of optimizers, particularly in deep learning, depends considerably on their chosen hyperparameter configuration. The efficacy of optimizers is often studied under near-optimal problem-specific hyperparameters, and finding these settings may be prohibitively costly for practitioners. In this work, we argue that a fair assessment of optimizers' performance must take the computational cost of hyperparameter tuning into account, i.e., how easy it is to find good hyperparameter configurations using an automatic hyperparameter search. Evaluating a variety of optimizers on an extensive set of standard datasets and architectures, we find that Adam is the most practical solution, particularly in low-budget scenarios.
REPORT
Sivaprasad_Idiap-RR-19-2019/IDIAP
On the Tunability of Optimizers in Deep Learning
Sivaprasad, Prabhu Teja
Mai, Florian
Vogels, Thijs
Jaggi, Martin
Fleuret, François
Benchmarking
Hyperparameter optimization
optimization
EXTERNAL
https://publications.idiap.ch/attachments/reports/2019/Sivaprasad_Idiap-RR-19-2019.pdf
PUBLIC
Idiap-RR-19-2019
2019
Idiap
December 2019
Under review at ICLR 2020
https://arxiv.org/abs/1910.11758
URL